Search Results
Now showing 1 - 10 of 45
Review | Citation - WoS: 16 | Citation - Scopus: 20
Assessing the Coverage of E-Health Services in Sub-Saharan Africa: A Systematic Review and Analysis (Georg Thieme Verlag KG, 2017)
Adeloye, Davies; Adigun, Taiwo; Misra, Sanjay; Omoregbe, Nicholas

Background: E-Health has attracted growing interest globally. The relative lack of facilities, skills, funds and information on existing e-Health initiatives has affected progress on e-Health in Africa. Objectives: To review publicly available literature on e-Health in sub-Saharan Africa (sSA) towards providing information on existing and ongoing e-Health initiatives in the region. Methods: Searches of relevant literature were conducted on Medline, EMBASE and Global Health, with search dates set from 1990 to 2016. We included studies on e-Health initiatives (prototypes, designs, or completed projects) targeting population groups in sSA. Results: Our search returned 2322 hits, with 26 studies retained. Included studies were conducted in 14 countries across the four sub-regions of sSA (Central, East, South and West) and spread over a 12-year period, 2002-2014. Six types of e-Health interventions were reported, with 17 studies (65%) based on telemedicine, followed by mHealth with 5 studies (19%). Other e-Health types include expert systems, electronic medical records, e-mails, and an online health module. Specific medical specialties covered include dermatology (19%), pathology (12%) and radiology (8%). Successes were "widely reported" (representing 50% overall acceptance or positive feedback in a study) in 10 studies (38%). The prominent challenges reported were technical problems, poor internet connectivity, participant selection biases, contextual issues, and lack of funds. Conclusion: E-Health is evolving in sSA, but with poorly published evidence. While we call for more quality research in the region, it is also important that population-wide policies and ongoing e-Health initiatives are contextually feasible, acceptable, and sustainable.

Article | Citation - WoS: 8 | Citation - Scopus: 18
Featuring CIO: Roles, Skills and Soft Skills (IGI Global, 2013)
Cano, Carmen; Fernandez-Sanz, Luis; Misra, Sanjay

This paper describes how the CIO (Chief Information Officer) position appears as a key role in organizations, along with the requirements for candidates. The authors compare the requirements presented in different studies to determine which skills matter most for successful performance as a CIO, stressing the importance of non-technical skills as key factors in professional performance. Soft skills for CIO or equivalent positions are compared with those of other professional profiles, such as programmers or analysts, using data taken from thousands of job ads. An overview of the most valuable skills (especially soft skills) for CIOs is presented.

Article | Citation - WoS: 20 | Citation - Scopus: 28
Bug Severity Assessment in Cross Project Context and Identifying Training Candidates (World Scientific Publ Co Pte Ltd, 2017)
Singh, V. B.; Misra, Sanjay; Sharma, Meera

Automatic bug severity prediction is useful for prioritising development efforts, allocating resources and assigning bug fixers, but it needs historical data on which classifiers can be trained. In the absence of such historical data, cross-project prediction provides a good solution. In this paper, our objective is to automate bug severity prediction using bug summary metrics and to identify the best training candidates in a cross-project context. Text mining is used to extract terms from bug summaries, and classifiers are trained on these terms. In all, 63 training candidates were designed by combining seven Eclipse project datasets to develop the severity prediction models. To deal with the imbalanced bug data problem, we employed two ensemble approaches using two operators available in RapidMiner: Vote and Bagging. Results show that k-Nearest Neighbour (k-NN) outperforms the Support Vector Machine (SVM), while Naive Bayes f-measure performance is poor, i.e. below 34.25%. For k-NN, building training candidates from more than one training dataset improves performance (f-measure and accuracy). The two ensemble approaches improved f-measure by up to 5% and 10% respectively for severity levels with fewer bug reports than the major severity level. The paper further demonstrates cross-project bug severity prediction between Eclipse and Mozilla products: with the SVM and k-NN classifiers, Mozilla products can be used to build reliable prediction models for Eclipse products and vice versa.
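
The summary-term pipeline described in this abstract can be pictured with a small sketch. The study used RapidMiner with the Vote and Bagging operators; the scikit-learn pipeline below is an illustrative stand-in on made-up data, not the paper's actual setup.

```python
# Minimal sketch of summary-based bug severity prediction. The paper works
# in RapidMiner; this scikit-learn pipeline and its toy data are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.metrics import f1_score

# Hypothetical training data: bug summaries and severities from one project.
train_summaries = [
    "crash on startup when workspace is locked",
    "typo in preferences dialog label",
    "NullPointerException while saving editor contents",
    "minor UI misalignment in toolbar",
]
train_severity = ["major", "trivial", "major", "minor"]

# Hypothetical test data standing in for a different (cross) project.
test_summaries = ["application crashes when opening large file"]
test_severity = ["major"]

# Text mining step: summaries become weighted term vectors; k-NN then
# classifies a new report by similarity to known reports.
model = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=1))
model.fit(train_summaries, train_severity)
pred = model.predict(test_summaries)
print(pred, f1_score(test_severity, pred, average="macro"))
```

Concatenating summaries from several projects into the training lists mimics the combined training candidates the study evaluates.
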
Article | Citation - WoS: 10
A Discussion on Assuring Software Quality in Small and Medium Software Enterprises: An Empirical Investigation (Univ Osijek, Tech Fac, 2011)
Pusatli, O. Tolga; Misra, Sanjay

Drawing on the core activities used to achieve quality objectives in small and medium-sized enterprises (SMEs), including software inspection, review and testing, the paper presents a contemporary view of how such companies stand against quality measures. Results from a local empirical investigation of quality standards in the Turkish software industry are reported. Around 150 software companies were approached, and the 17 detailed responses indicate that, to ensure software quality, internationally recognised standards such as those of the International Standards Organization (ISO) and Capability Maturity Model Integration (CMMI) are given credit. However, the substantial workload and resources required to obtain them are reported as a serious burden, and the downscaled frameworks of such large models proposed in the literature are not well known to the SMEs either. The paper also discusses "work-arounds" that bypass such standards to ease product delivery while keeping certificates as labels merely to win new business.

Article | Citation - WoS: 8 | Citation - Scopus: 10
COBOL Systems Migration to SOA: Assessing Antipatterns and Complexity (Kaunas Univ Technology, 2019)
Mateos, Cristian; Zunino, Alejandro; Flores, Andres; Misra, Sanjay

SOA and Web Services allow users to easily expose business functions to build larger distributed systems. However, legacy systems, mostly in COBOL, are left aside unless a migration approach is applied. The main approaches are direct and indirect migration. The former wraps COBOL programs with a thin layer in a Web Service oriented language/platform; the latter requires reengineering COBOL functions to a modern language/platform. In our previous work, we presented an intermediate approach based on direct migration in which the developed Web Services are later refactored to improve the quality of their interfaces. The refactorings mainly capture good practices inherent to indirect migration: antipatterns in WSDL documents (common bad practices) are detected to prevent issues with WSDL understanding and discoverability. In this paper, we assess the antipatterns of Web Services' WSDL documents generated under the three migration approaches. In addition, the complexity of the generated Web Service interfaces is measured, addressing both comprehension and interoperability. We apply a metric suite (by Baski & Misra) to measure complexity on service interfaces, i.e., WSDL documents. Migrations of two real COBOL systems under the three approaches were assessed for antipattern evidence and the complexity of the generated SOA frontiers, a total of 431 WSDL documents.
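
As a rough illustration of measuring WSDL interface complexity, the sketch below counts a few structural constructs in a WSDL document. It is not the Baski & Misra metric suite; the constructs counted and the file name are assumptions.

```python
# Illustrative structural counts over a WSDL document, in the spirit of
# interface-complexity metrics; NOT the actual Baski & Misra suite.
import xml.etree.ElementTree as ET

def wsdl_counts(path: str) -> dict:
    """Count WSDL constructs that complexity metrics typically weigh."""
    root = ET.parse(path).getroot()
    counts = {"operation": 0, "message": 0, "part": 0}
    for elem in root.iter():
        local = elem.tag.rsplit("}", 1)[-1]  # strip the XML namespace prefix
        if local in counts:
            counts[local] += 1
    return counts

# Hypothetical usage over one migrated service's interface:
# print(wsdl_counts("migrated_service.wsdl"))
```

Running such counts over the WSDLs produced by each migration approach gives a crude but comparable per-approach complexity profile.
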
Article | Citation - Scopus: 1
Optimizing the Stochastic Deployment of Small Base Stations in an Interleave Division Multiple Access-Based Heterogeneous Cellular Network (Wiley, 2022)
Noma-Osaghae, Etinosa; Misra, Sanjay; Koyuncu, Murat

The use of small base stations (SBSs) to improve the throughput of cellular networks gave rise to heterogeneous cellular networks (HCNs). Still, the performance of interleave division multiple access (IDMA) in sleep-mode-active HCNs has not been studied in the existing literature. This research examines the 24-h throughput, spectral efficiency (SE), and energy efficiency (EE) of an IDMA-based HCN and compares the results with orthogonal frequency division multiple access (OFDMA). An energy-spectral-efficiency (ESE) model of a two-tier HCN was developed, and a weighted-sum modified particle swarm optimization (PSO) algorithm simultaneously maximized the SE and EE of the IDMA-based HCN. The results show that IDMA performs at least 68% better than OFDMA on the throughput metric, and that the PSO algorithm produced the Pareto-optimal front at moderate traffic levels for all varied network parameters: SINR threshold, SBS density, and sleep-mode technique. The IDMA-based HCN can improve throughput, SE, and EE via sleep-mode techniques. Still, the combination of network parameters that simultaneously maximizes SE and EE is interference-limited. In sleep mode, the HCN performs better if the SBSs can adapt to spatial and temporal variations in network traffic.
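
The weighted-sum PSO approach used here (and in the SCMA study further down) can be sketched on a toy objective. The se() and ee() functions below are placeholders, not the ESE model derived in the paper, and the decision variables merely stand in for SBS density and transmit power.

```python
# Toy weighted-sum PSO for an SE/EE tradeoff. Objectives are assumptions.
import random

def se(x):
    density, power = x
    return density / (1.0 + 0.1 * power)   # assumed toy spectral efficiency

def ee(x):
    density, power = x
    return 1.0 / (0.5 + density * power)   # assumed toy energy efficiency

def fitness(x, w=0.5):
    return w * se(x) + (1.0 - w) * ee(x)   # weighted-sum scalarization

def pso(n=20, iters=100, lo=0.1, hi=10.0):
    pts = [[random.uniform(lo, hi), random.uniform(lo, hi)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pts]
    gbest = max(pts, key=fitness)[:]
    for _ in range(iters):
        for i, p in enumerate(pts):
            for d in range(2):
                # Standard velocity update: inertia + pull toward personal
                # and global bests, then clamp the position to the bounds.
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - p[d])
                             + 1.5 * random.random() * (gbest[d] - p[d]))
                p[d] = min(hi, max(lo, p[d] + vel[i][d]))
            if fitness(p) > fitness(pbest[i]):
                pbest[i] = p[:]
            if fitness(p) > fitness(gbest):
                gbest = p[:]
    return gbest

print(pso())  # best (density, power) pair for the chosen weight w
```

Sweeping the weight w from 0 to 1 and re-running the optimizer traces out the kind of SE-EE Pareto front these studies report.
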
Article | Citation - WoS: 9 | Citation - Scopus: 14
A Comparative Study of Agile, Component-Based, Aspect-Oriented and Mashup Software Development Methods (Univ Osijek, Tech Fac, 2012)
Patel, Ahmed; Seyfi, Ali; Taghavi, Mona; Wills, Christopher; Na, Liu; Latih, Rodziah; Misra, Sanjay

This paper compares Agile Methods, Component-Based Software Engineering (CBSE), Aspect-Oriented Software Development (AOSD) and Mashups as the four most advanced software development methods. These approaches depend almost entirely on their application domain, yet their usability can be applied equally across domains. The purpose of this comparative analysis is to give a succinct and clear review of the four methodologies. Their definitions, characteristics, advantages and disadvantages are considered, and a conceptual mind-map is generated that lays a foundation for formulating and designing a possible new integrated software development approach, including supportive techniques for cross-fertilizing the examined methods' potential advantages. It is a basis upon which new thinking may be initiated and further research stimulated in the software engineering field.

Article | Citation - WoS: 7 | Citation - Scopus: 11
Software Project Scheduling Using the Hyper-Cube Ant Colony Optimization Algorithm (Univ Osijek, Tech Fac, 2015)
Crawford, Broderick; Soto, Ricardo; Johnson, Franklin; Misra, Sanjay; Paredes, Fernando; Olguin, Eduardo

This paper proposes a design of the Ant Colony Optimization algorithm that uses the Hyper-Cube framework to solve the Software Project Scheduling Problem. This NP-hard problem consists of assigning tasks to employees so as to minimize the project duration and its overall cost, while satisfying the problem constraints and the precedence between tasks. The approach employs the Hyper-Cube framework to establish an explicitly multidimensional space that controls the ants' behaviour, allowing the exploration of the search space to be handled autonomously with the aim of reaching promising solutions.
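
A minimal sketch of the Hyper-Cube idea as it applies here: pheromone values are confined to [0,1] and pulled toward the best assignment found so far. The cost model below (task efforts, employee speeds, no precedence constraints) is an assumed toy, not the paper's formulation.

```python
# Sketch of a Hyper-Cube style pheromone update for task-to-employee
# assignment; pheromone stays in [0,1]. The cost model is an assumption.
import random

TASKS, EMPLOYEES = 5, 3
effort = [4, 2, 6, 3, 5]                 # hypothetical task efforts
speed = [1.0, 1.5, 0.8]                  # hypothetical employee speeds
tau = [[0.5] * EMPLOYEES for _ in range(TASKS)]  # pheromone matrix in [0,1]

def construct():
    """Each ant assigns tasks to employees with probability ~ pheromone."""
    return [random.choices(range(EMPLOYEES), weights=tau[t])[0]
            for t in range(TASKS)]

def duration(assign):
    load = [0.0] * EMPLOYEES
    for t, e in enumerate(assign):
        load[e] += effort[t] / speed[e]
    return max(load)  # the project ends when the busiest employee finishes

best = construct()
for _ in range(200):
    ants = [construct() for _ in range(10)]
    it_best = min(ants, key=duration)
    if duration(it_best) < duration(best):
        best = it_best
    for t in range(TASKS):
        for e in range(EMPLOYEES):
            # Hyper-Cube update: move tau toward the best-so-far solution;
            # the convex combination keeps every value inside [0,1].
            target = 1.0 if best[t] == e else 0.0
            tau[t][e] += 0.1 * (target - tau[t][e])

print(best, duration(best))
```

Keeping pheromone normalized is the point of the Hyper-Cube framework: it makes the search behave consistently regardless of the cost scale of a particular project instance.
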
Article | Citation - WoS: 3 | Citation - Scopus: 5
Particle Swarm Optimization of the Spectral and Energy Efficiency of an SCMA-Based Heterogeneous Cellular Network (Wiley, 2022)
Noma-Osaghae, Etinosa; Misra, Sanjay; Ahuja, Ravin; Koyuncu, Murat

Background: The effect of stochastic small base station (SBS) deployment on the energy efficiency (EE) and spectral efficiency (SE) of sparse code multiple access (SCMA)-based heterogeneous cellular networks (HCNs) is still mostly unknown. Aim: This study seeks to provide insight into the interaction between SE and EE in SBS sleep-mode-enabled SCMA-based HCNs. Methodology: A model characterizing the energy-spectral-efficiency (ESE) of a two-tier SBS sleep-mode-enabled SCMA-based HCN was derived, and a multiobjective optimization problem was formulated to maximize the SE and EE of the network simultaneously. The problem was solved with a proposed weighted-sum modified particle swarm optimization (PSO) algorithm, whose performance was compared with the genetic algorithm (GA) and with the case where the SCMA-based HCN is unoptimized. Results: The generated Pareto-optimal front showed simultaneous maximization of SE and EE at high traffic levels, and a convex front that lets network operators flexibly select the SE-EE tradeoff at low traffic levels. The proposed PSO algorithm yields a higher SBS density and a higher SBS transmit power at high traffic levels than at low traffic levels. The unoptimized SCMA-based HCN achieves 80% lower SE and 51% lower EE than the PSO-optimized network. The optimal SE and EE achieved with the proposed PSO algorithm and with the GA are comparable, but the proposed PSO uses a 51.85% lower SBS density and a 35.96% lower SBS transmit power to reach them at moderate traffic levels. Conclusion: In sleep-mode-enabled SCMA-based HCNs, network engineers must choose the balance of SBS density and SBS transmit power that achieves the desired SE and EE.

Article | Citation - WoS: 17 | Citation - Scopus: 26
Toward Ontology-Based Risk Management Framework for Software Projects: An Empirical Study (Wiley, 2020)
Abioye, Temitope Elizabeth; Arogundade, Oluwasefunmi Tale; Misra, Sanjay; Akinwale, Adio T.; Adeniran, Olusola John

Software risk management is a proactive decision-making practice with processes, methods, and tools for managing risks in a software project. Many existing techniques for software project risk management are textual documentation with varying perspectives that are non-reusable and cannot be shared. In this paper, a life-cycle approach to an ontology-based risk management framework for software projects is presented, using a dataset drawn from the literature, domain experts, and practitioners. The identified risks are refined by 19 software experts; the risks are conceptualized, modeled, and developed using Protégé, then qualitatively analyzed and prioritized, and aversion methods are provided. The framework is adopted in real-life software projects. Precision, recall, and F-measure metrics are used to validate the performance of the extraction tool, while performance and perception evaluations are carried out using a performance appraisal form and the technology acceptance model, respectively. Mean scores from the performance and perception evaluations are compared with an evaluation concept scale. Results showed that cost is reduced, high-quality projects are delivered on time, and software developers found the framework a potent tool for their day-to-day software development activities.
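
For the validation metrics named in this last abstract, a short worked example: precision, recall, and F-measure computed from extraction counts. The counts below are hypothetical, purely to show the arithmetic.

```python
# Worked example of the precision, recall and F-measure used to validate
# an extraction tool; the counts are made-up illustration values.
def prf(tp: int, fp: int, fn: int) -> tuple:
    precision = tp / (tp + fp)            # fraction of extracted items correct
    recall = tp / (tp + fn)               # fraction of true items extracted
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

# e.g., 42 correctly extracted risks, 6 spurious, 8 missed (hypothetical)
print(prf(tp=42, fp=6, fn=8))  # -> (0.875, 0.84, 0.857...)
```
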

