Search published articles


Showing 59 results for Analysis

Laya Olfat, Maghsoud Amiri, Jahanyar Bamdad Soofi, Mostafa Ebrahimpour Azbari,
Volume 25, Issue 2 (5-2014)
Abstract

Having a comprehensive evaluation model with reliable data is useful for improving supply chain performance. In this paper, according to the nature of the supply chain, a model is presented that is able to evaluate the performance of the supply chain using a network data envelopment analysis (NDEA) model together with the financial, intellectual capital (knowledge-based), collaboration, and responsiveness factors of the supply chain. In the first step, indicators were determined and explained by exploratory factor analysis. Then, the NDEA model was applied. This paper reports research on the supply chains of pharmaceutical companies listed on the Tehran Stock Exchange, with 115 experts and senior executives questioned as the sample. The results showed that the responsiveness latent variable had the highest correlation with supply chain performance, followed by the collaboration, financial, and intellectual capital (knowledge-based) latent variables, respectively. Four of the twenty-eight supply chains studied obtained the highest performance score of 1, and the lowest observed performance was 0.43.
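The abstract does not reproduce the NDEA formulation; as background, the single-stage CCR ratio model that network DEA extends can be written (in standard DEA notation, not taken from the paper) as:

```latex
\max_{u,v}\ \theta_o=\frac{\sum_{r} u_r\, y_{ro}}{\sum_{i} v_i\, x_{io}}
\quad\text{s.t.}\quad
\frac{\sum_{r} u_r\, y_{rj}}{\sum_{i} v_i\, x_{ij}}\le 1\ \ \forall j,
\qquad u_r,\, v_i \ge 0,
```

where \(x_{ij}\) and \(y_{rj}\) are the inputs and outputs of DMU \(j\), and the DMU under evaluation, \(o\), is efficient when \(\theta_o = 1\) (matching the efficiency scores between 1 and 0.43 reported above).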
Amin Parvaneh, Mohammadjafar Tarokh, Hossein Abbasimehr,
Volume 25, Issue 3 (7-2014)
Abstract

Data mining is a powerful tool for firms to extract knowledge from their customers’ transaction data. One of its useful applications is segmentation, an effective tool for managers to target the right marketing strategies at the right customer segments. In this study, we segmented the retailers of a hygiene products manufacturer. Nowadays, all manufacturers understand that to stay in a competitive market, they should set up an effective relationship with their retailers. We propose an LRFMP (relationship Length, Recency, Frequency, Monetary, and Potential) model for retailer segmentation. Ten retailer clusters were obtained by applying the K-means algorithm, with the optimum K chosen according to the Davies-Bouldin index, to the LRFMP variables. We analyzed the resulting clusters by the weighted sum of LRFMP values, where the weight of each variable was calculated by the Analytic Hierarchy Process (AHP) technique. In addition, we analyzed each cluster in order to formulate segment-specific marketing actions for retailers. The results of this research can help marketing managers gain deep insights about retailers.
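The paper's LRFMP data and AHP weights are not reproduced here; purely as an illustration of the clustering step, a minimal pure-Python K-means with the Davies-Bouldin index (on made-up 2-D points rather than the five LRFMP variables) might look like:

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Lloyd's K-means on a list of equal-length tuples."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # Recompute centers as cluster means (keep old center if empty)
        centers = [tuple(sum(d) / len(c) for d in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

def davies_bouldin(centers, clusters):
    """Davies-Bouldin index: lower means better-separated clusters."""
    s = [sum(math.dist(p, c) for p in cl) / len(cl)
         for c, cl in zip(centers, clusters)]
    k = len(centers)
    total = 0.0
    for i in range(k):
        total += max((s[i] + s[j]) / math.dist(centers[i], centers[j])
                     for j in range(k) if j != i)
    return total / k
```

In the study's setting, one would run this for several values of K and keep the K with the lowest Davies-Bouldin index.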
Seyed Mojtaba Jafari Henjani, Valeriy Severin,
Volume 25, Issue 3 (7-2014)
Abstract

The paper is devoted to solving some problems in nuclear power station generating unit intelligent control systems using genetic algorithms, on the basis of control system model development, optimization methods for their direct quality indices, and improved integral quadratic estimates. Mathematical vector models were obtained for the control system's multicriterion quality indices with due consideration of stability and quality criteria, increasing the reliability of optimal control system synthesis. Optimal control systems with fuzzy controllers were synthesized for the nuclear reactor, steam generator, and steam turbine, allowing comparison between fuzzy controllers and traditional PID controllers. Mathematical models were built for the nuclear power station generating unit control systems, including the nuclear reactor, steam generator, steam turbine, and their control systems interacting under normal operating modes, which permitted parametric synthesis of the system and the study of various power unit control laws. On the basis of the power unit control system models, controllers were synthesized for normal operating modes.
Mr Sachin Mahakalkar, Dr Vivek Tatwawadi, Mr Jayant Giri, Dr Jayant Modak,
Volume 26, Issue 1 (3-2015)
Abstract

Response surface methodology (RSM) is a statistical method useful in the modeling and analysis of problems in which the response variable is influenced by several independent variables, in order to determine the conditions under which these variables should operate to optimize a corrugated box production process. The purpose of this research is to create response surface models through regression on experimental data that has been reduced using dimensional analysis (DA), to obtain optimal processing conditions. Studies carried out for corrugated sheet box manufacturing industries with man-machine systems revealed the contribution of many independent parameters to cycle time. The independent parameters include anthropometric data of workers, personal data, machine specification, workplace parameters, product specification, environmental conditions, and mechanical properties of the corrugated sheet. Their effect on the response parameter, cycle time, was previously unknown. The developed model was simulated and optimized with the aid of MATLAB R2011a, and the computed value for cycle time was obtained and compared with the experimental value. The results showed that the correlation R, adjusted R2, and RMS error were valid.
Mr Aliakbar Hasani, Mr Seyed Hessameddin Zegordi,
Volume 26, Issue 1 (3-2015)
Abstract

In this study, an optimization model is proposed to design a Global Supply Chain (GSC) for a medical device manufacturer under disruption, in the presence of pre-existing competitors and price inelasticity of demand. Static competition between the distributors’ facilities to more efficiently gain further market share within the Economic Cooperation Organization Trade Agreement (ECOTA) is considered; this competition is affected by disruption occurrence. The aim of the proposed model is to maximize the expected net after-tax profit of the GSC under the disruption and normal situations simultaneously. To deal effectively with disruption, some practical strategies are adopted in the design of the GSC network. The uncertainty of the business environment is modeled using the robust optimization technique based on the concept of an uncertainty budget. To tackle the proposed Mixed-Integer Nonlinear Programming (MINLP) model, a hybrid Taguchi-based Memetic Algorithm (MA) with an adaptive population size is developed that incorporates a customized Adaptive Large Neighborhood Search (ALNS) as its local search heuristic. A fitness landscape analysis is used to improve the systematic procedure of neighborhood selection in the proposed ALNS. A numerical example and computational results illustrate the efficiency of the proposed model and algorithm in dealing with global disruptions under uncertainty and competitive pressure.
Mahdi Karbasian, Mohammad Farahmand, Mohammad Ziaei,
Volume 26, Issue 2 (7-2015)
Abstract

This research aims at presenting a consolidated model of the data envelopment analysis (DEA) technique and value engineering to select the best manufacturing methods for gate valve covers, and at ranking the methods using TOPSIS. To do so, efficiency evaluation indices were selected based on the value engineering approach, and different manufacturing methods were evaluated using the DEA technique. Finally, the effective methods were ranked using TOPSIS. Accordingly, 48 different methods were identified for manufacturing the part. The DEA results showed that only 12 methods are fully efficient. Meanwhile, manufacturing method No. 32 (A216 WCB casting purchased from the Chinese market as the raw material, machining by CNC+NC, and drilling by radial drill) was ranked first. Major limitations of the research include time and place limitations, lack of access to the standards adaptability index for different machining and drilling methods, limitations on evaluating all parts of a product, limitation to a single technique for evaluating efficiency and ranking, and mere reliance on superior indices in each factor of value engineering. Most previous studies evaluated the efficiency of manufacturing methods based on a single approach only. By applying value engineering, which is in fact a combination of three approaches (quality, functional, and cost), the present research provides a far more comprehensive model to evaluate manufacturing methods in industry.
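TOPSIS itself is a standard ranking technique; a minimal sketch of the ranking step, with illustrative data rather than the 48 manufacturing methods above, could be:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows of `matrix`) by closeness to the ideal.
    benefit[j] is True if criterion j is to be maximized."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # Ideal and anti-ideal points per criterion
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores
```

Higher closeness coefficients rank higher; in the study's setting the rows would be the 12 DEA-efficient methods and the columns the value engineering indices.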



Dr. Mustafa Jahnagoshai Rezaee, Dr. Alireza Moini,
Volume 26, Issue 4 (11-2015)
Abstract

Data envelopment analysis (DEA) and the balanced scorecard (BSC) are two well-known approaches for measuring the performance of decision making units (DMUs). BSC is especially applied with qualitative measures, whereas DEA is more appropriate when quantitative measures are used for evaluation. In the real world, DMUs usually have complex structures, such as network structures. One of the well-known network structures is the two-stage process with intermediate measures. In this structure, there are two stages, and each stage uses inputs to produce outputs separately, where the first stage's outputs are inputs to the second stage. This paper deals with integrated DEA and game theory approaches for evaluating two-stage processes. In addition, it extends the DEA model based on BSC perspectives: BSC is used to categorize the efficiency measures in the two-stage process. Furthermore, we propose a two-stage DEA model that considers a leader-follower structure and includes multiple sub-stages in the follower stage. To determine the importance of each category of measures in a competitive environment, cooperative and non-cooperative game approaches are used. A case study measuring the performance of power plants in Iran is presented to show the abilities of the proposed approach.



Aghil Hamidihesarsorkh, Ali Papi, Ali Bonyadi Naeini, Armin Jabarzadeh,
Volume 28, Issue 1 (3-2017)
Abstract

Nowadays, the popularity of social networks as marketing tools has brought a great deal of attention to social network analysis (SNA). One of the well-known problems in this field is the influence maximization problem, which relates to the flow of information within networks. Although the problem has been considered by many researchers, the concept behind it has been used less in a business context. In this paper, using a cost-benefit analysis, we propose a multi-objective optimization model that helps identify the location of key nodes, which represent potential influential customers in real social networks. The main novelty of this model is that it determines the best nodes by combining two essential and realistic elements simultaneously: diffusion speed and dispersion cost. The performance of the proposed model is validated by detecting key nodes in a real social network.
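The paper's multi-objective model is not given in the abstract; the underlying influence-maximization idea is commonly approximated by greedy seed selection under the independent cascade diffusion model, sketched here on an assumed toy graph:

```python
import random

def simulate_ic(graph, seeds, p=0.1, rng=None):
    """One independent-cascade run; returns the number of activated nodes.
    graph: dict mapping node -> list of out-neighbors."""
    rng = rng or random.Random()
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                # Each newly active node tries once to activate each neighbor
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def greedy_seeds(graph, k, p=0.1, runs=200, seed=0):
    """Greedily pick k seed nodes maximizing estimated expected spread."""
    rng = random.Random(seed)
    chosen = []
    for _ in range(k):
        best, best_spread = None, -1.0
        for node in graph:
            if node in chosen:
                continue
            spread = sum(simulate_ic(graph, chosen + [node], p, rng)
                         for _ in range(runs)) / runs
            if spread > best_spread:
                best, best_spread = node, spread
        chosen.append(best)
    return chosen
```

The study additionally trades diffusion speed off against dispersion cost; the greedy routine above optimizes spread only.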


Abdollah Eshghi, Mehrdad Kargari,
Volume 29, Issue 1 (3-2018)
Abstract

In this paper, a fraud detection method is proposed in which user behaviors are modeled using two main components, namely an abnormal trend analysis component and a scenario-based component. The extent of deviation of a transaction from the user's normal behavior is estimated using fuzzy membership functions. The results of applying all membership functions to a transaction are then fused, and a final risk score is obtained, which is the basis for deciding whether to block the arriving transaction. An optimized threshold for the final risk value is estimated in order to balance the fraud detection rate against the alarm rate. Although the assessment of such problems is complicated, we show that this method can be useful in practice according to several measures and metrics.
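As a rough sketch of the fuzzy fusion idea, with assumed membership functions and a max-fusion operator (the paper's actual functions and fusion rule are not reproduced here):

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside [a, d], 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def final_risk(amount, txn_per_hour, profile):
    """Fuse (here: max) the memberships of several fuzzy 'abnormality'
    indicators into one risk score in [0, 1]. Thresholds are illustrative."""
    # Deviation of the amount from the user's usual spend, in std units
    z = abs(amount - profile["mean_amount"]) / profile["std_amount"]
    m_amount = trapezoid(z, 1.0, 3.0, 1e9, 1e9 + 1)
    # Transaction velocity: many transactions per hour is suspicious
    m_velocity = trapezoid(txn_per_hour, 2.0, 6.0, 1e9, 1e9 + 1)
    return max(m_amount, m_velocity)

def decide(risk, threshold=0.7):
    """Block the transaction when the fused risk exceeds the threshold."""
    return "block" if risk >= threshold else "allow"
```

The threshold plays the role described in the abstract: raising it lowers the alarm rate at the cost of the detection rate.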
Mojtaba Hamid, Mahdi Hamid, Mohammad Mahdi Nasiri, Mahdi Ebrahimnia,
Volume 29, Issue 2 (6-2018)
Abstract

The surgical theater is one of the most expensive hospital resources, and a high percentage of hospital admissions are related to it. Efficient planning and scheduling of the operating rooms (ORs) is therefore necessary to improve the efficiency of any healthcare system. In this paper, the weekly OR planning and scheduling problem is addressed to minimize the waiting time of elective patients, the over- and under-utilization costs of the ORs, and the total completion time of surgeries. We take into account the available hours of the ORs and surgeons, legal constraints, the job qualifications of surgeons, and the priority of patients in the model. A real-life example is provided to demonstrate the effectiveness and applicability of the model; it is solved using the ε-constraint method in GAMS. Then, data envelopment analysis (DEA) is employed to obtain the best solution among the Pareto solutions generated by the ε-constraint method. Finally, the best Pareto solution is compared to the schedule used in the hospital. The results indicate that the best Pareto solution outperforms the schedule offered by the OR director.
Arezoo Jahani, Parastoo Mohammadi, Hamid Mashreghi,
Volume 29, Issue 2 (6-2018)
Abstract

The Innovation & Prosperity Fund (IPfund) in Iran, a governmental organization, aims to develop new technology-based firms (NTBFs) by financing them from its available resources. The innovative projects that apply to the IPfund for financing are at a stage at which they can receive both fixed-rate facilities and partnership in the projects, i.e., profit-loss sharing (PLS). Since the fund must protect the initial and real value of its capital against inflation, this study aims to examine suitable financing methods while considering risk. For this purpose, we study risk assessment models to see how to use the risk-adjusted net present value for knowledge-based projects. On this basis, the NPV of a project is analyzed by taking into account the risk variables (sales revenue and the cost of fixed investment) and using Monte Carlo simulation. The results indicate that in most cases the risk-adjusted NPV of a project under the partnership scenario is higher than under the other scenario. In addition, partnership in projects that demand industrial production facilities is preferable for the IPfund to projects calling for working capital.
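The risk-adjusted NPV analysis can be sketched with a Monte Carlo simulation over the two risk variables named in the abstract; the distributions and figures below are illustrative assumptions, not the fund's data:

```python
import random
import statistics

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_npv(n=10_000, rate=0.18, seed=0):
    """Monte Carlo NPV: sales revenue and fixed investment cost
    are drawn at random (illustrative normal distributions)."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        investment = rng.gauss(1000, 100)                 # fixed cost at t = 0
        revenue = [rng.gauss(400, 80) for _ in range(5)]  # yearly sales, 5 years
        results.append(npv(rate, [-investment] + revenue))
    return statistics.mean(results), statistics.stdev(results)
```

Running the simulation once per financing scenario (fixed-rate vs. PLS cash flows) and comparing the resulting NPV distributions mirrors the comparison described in the abstract.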
Ali Vaysi, Abbas Rohani, Mohammad Tabasizadeh, Rasool Khodabakhshian, Farhad Kolahan,
Volume 29, Issue 3 (9-2018)
Abstract

Nowadays, the CNC machining industry uses the FMEA approach to improve performance and to reduce component failures and machine downtime. The FMEA method is one of the most useful approaches for maintenance scheduling and, consequently, for reliability improvement. This paper presents an approach to prioritize and assess the failures of the electrical and control components of a CNC lathe. In this method, the electrical and control components were analyzed independently for every failure mode according to the risk priority number (RPN). The results showed that the conventional method, by means of a weighted average, generates different RPN values for the subsystems studied. The best result for fuzzy FMEA was obtained with the 10-point scale and the centroid defuzzification method. The fuzzy FMEA sensitivity analysis showed that the subsystem risk level depends on the O, S, and D indices, respectively. The risk clustering showed that the failure modes can be grouped into three risk clusters, and a similar maintenance policy can be adopted for all failure modes placed in a cluster. The prioritization of risks can also help the maintenance team choose corrective actions consciously. In conclusion, the fuzzy FMEA method was found to be suitable for the CNC machining industry, and it helped to increase the level of confidence in the CNC lathe machine.
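The conventional RPN computation underlying FMEA, with illustrative thresholds for the three risk clusters (the paper's fuzzy scales and actual cluster boundaries are not reproduced here):

```python
def rpn(o, s, d):
    """Classical risk priority number: occurrence x severity x detection."""
    return o * s * d

def cluster_by_risk(failure_modes, low=100, high=300):
    """Group failure modes into three risk bands by RPN.
    failure_modes: dict of name -> (O, S, D); thresholds are assumptions."""
    groups = {"low": [], "medium": [], "high": []}
    for name, (o, s, d) in failure_modes.items():
        score = rpn(o, s, d)
        band = "low" if score < low else "medium" if score < high else "high"
        groups[band].append((name, score))
    return groups
```

All failure modes landing in the same band would then share a maintenance policy, as the abstract describes.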
Hamiden Abd Elwahed Khalifa, El- Saed Ebrahim Ammar,
Volume 30, Issue 1 (3-2019)
Abstract

Fully fuzzy linear programming is applied to water resources management due to its close connection with human life, which is considered to be of great importance. This paper investigates decision-making concerning water resources management under uncertainty, based on two-stage stochastic fuzzy linear programming. A solution method for solving the problem with fuzziness in relations is suggested to prove its applicability. The purpose of the method is to generate a set of solutions for water resources planning that helps the decision-maker make a tradeoff between economic efficiency and the risk of violating the constraints. Finally, a numerical example is given and approached by the proposed method.
 
Mangesh Phate, Shraddha Toney, Vikas Phate,
Volume 30, Issue 1 (3-2019)
Abstract

Wire EDM of oil-hardening die steel materials is a complicated machining process, so finding the best set of process parameters is an important step in the wire EDM process. Multi-response optimization of the machining parameters was carried out using desirability function analysis coupled with the dimensional analysis (DA) approach. In the present work, based on Taguchi's L27 orthogonal array, a number of experiments were conducted on OHNS material. The WEDM process parameters, such as pulse-on time, pulse-off time, input current, wire feed rate, and servo voltage, are optimized with respect to multi-response considerations such as material removal rate and surface roughness. Based on the desirability analysis, the most favorable levels of the parameters have been identified, and the significant contribution of each parameter is determined by dimensional analysis. The experimental results show that the results obtained using the DA approach agree well with the measured responses: a correlation of up to 99% has been achieved between the developed model and the measured responses.
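Desirability function analysis maps each response onto a [0, 1] score and aggregates by geometric mean; a minimal Derringer-style sketch, with assumed response bounds rather than the experiment's:

```python
def d_larger(y, lo, hi, w=1.0):
    """Larger-the-better desirability (e.g. material removal rate)."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** w

def d_smaller(y, lo, hi, w=1.0):
    """Smaller-the-better desirability (e.g. surface roughness)."""
    if y <= lo:
        return 1.0
    if y >= hi:
        return 0.0
    return ((hi - y) / (hi - lo)) ** w

def composite_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))
```

The parameter setting with the highest composite desirability across all experimental runs is taken as the most favorable level combination.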
Hassan Rashidi, Fereshteh Azadi Parand,
Volume 30, Issue 3 (9-2019)
Abstract

One of the modern paradigms to develop a system is object oriented analysis and design. In this paradigm, there are several objects and each object plays some specific roles. There is a sequence of activities to develop an analysis model. In the first step, we work in developing an initial use case model. Then in the second step, they identify a number of concepts and build a glossary of participating objects.  Identifying attributes of objects (and classes) is one of the most important steps in the object-oriented paradigm. This paper proposes a method to identify attributes of objects and verify them. The method is also concerned itself with classifying and eliminating the incorrect attributes of objects. Then the method is evaluated in a large application, a Control Command Police System. After that, several guidelines on attributes of objects, based on the practical experience obtained from the evaluation, are provided.
Saadat Ali Rizvi, Ali Wajahat,
Volume 30, Issue 3 (9-2019)
Abstract

CNC turning is widely used as a manufacturing process through which unwanted material is removed to obtain a high degree of surface finish. In this research article, the Taguchi technique was coupled with grey relational analysis (GRA) to optimize the turning parameters for simultaneous improvement of productivity, average surface roughness (Ra), and root mean square roughness (Rq). A Taguchi L27 (3^4) orthogonal array was used in this experimental work. Feed, speed, and depth of cut were considered the controllable process parameters, while average roughness (Ra), root mean square roughness (Rq), and material removal rate (MRR) were considered the performance characteristics. From the TGRA results, it was revealed that the optimum parameter combination for multi-performance, based on mean response values and confirmation experiments with Taguchi-based GRA, is A1B1C1 (Vc = 400 rpm, f = 0.06 mm/rev, and DOC = 0.5 mm). The optimum values obtained from the experimental investigations were Ra = 6.86 μm and MRR = 20690.31 mm3/s. Further, analysis of variance (ANOVA) was applied, and it was identified that the depth of cut has the most significant effect, followed by speed and feed, for multi-response optimization. The percentage contribution of the depth of cut was 38.28.71 %, of speed 11.89 %, and of feed 8.466 %.
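The grey relational grade computation behind Taguchi-GRA can be sketched as follows, with illustrative response data rather than the experiment's measurements:

```python
def grey_relational_grades(matrix, benefit, zeta=0.5):
    """Taguchi-GRA: normalize each response column, compute grey relational
    coefficients against the ideal sequence, and average per run.
    matrix: rows = experimental runs, cols = responses.
    benefit[j]: True for larger-the-better (e.g. MRR),
    False for smaller-the-better (e.g. Ra)."""
    cols = list(zip(*matrix))
    norm = []
    for j, col in enumerate(cols):
        lo, hi = min(col), max(col)
        norm.append([(x - lo) / (hi - lo) if benefit[j] else (hi - x) / (hi - lo)
                     for x in col])
    rows = list(zip(*norm))
    # Deviation from the ideal (1.0 after normalization)
    deltas = [[1.0 - x for x in row] for row in rows]
    dmin = min(min(r) for r in deltas)
    dmax = max(max(r) for r in deltas)
    # Grey relational coefficient with distinguishing coefficient zeta
    coeff = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
             for row in deltas]
    return [sum(row) / len(row) for row in coeff]
```

The run with the highest grade is taken as the optimum multi-performance combination, which is how the A1B1C1 setting above would be identified.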
Mehrdad Jalali Sepehr, Abdorrahman Haeri, Rouzbeh Ghousi,
Volume 30, Issue 4 (12-2019)
Abstract

Background: In this paper, the healthcare condition of 31 member countries of the Organisation for Economic Co-operation and Development (OECD) is measured by considering 14 indicators relevant to the three main pillars of sustainable development.
Method: To estimate the efficiency scores, the Principal Component Analysis-Data Envelopment Analysis (PCA-DEA) additive model, in both its envelopment and multiplier forms, is used to determine efficiency scores and to define benchmarks and improvement plans for the inefficient countries. Decision tree analysis is then used to determine which factors are the most influential in making a country an efficient Decision Making Unit (DMU).
Results: According to the PCA-DEA additive model, 16 of the 31 OECD countries are inefficient, with the USA receiving the lowest efficiency score. Among the efficient countries, Iceland can be considered a paragon, as it appears most frequently among the countries defined as benchmarks. Decision tree analysis also shows that exposure to PM2.5 is an influential factor in the efficiency status of countries.
Conclusion: This research gives insight into sustainable development and healthcare systems, and shows the impressive effect of environmental and social factors, such as exposure to PM2.5, water quality, population insurance coverage, and AIDS, on the healthcare efficiency of OECD countries.
Mangesh Phate, Shraddha Toney, Vikas Phate,
Volume 31, Issue 2 (6-2020)
Abstract

In the present work, a model based on dimensional analysis (DA) coupled with the Taguchi method is presented to analyze the impact of silicon carbide (SiC). The wire-cut electrical discharge machining (WEDM) performance of aluminium silicon carbide (AlSiC) metal matrix composite (MMC) has been critically examined. To formulate the DA-based models, a total of 18 experiments were conducted using Taguchi's L18 mixed plan of experimentation. The inputs used in the DA models are pulse-on time, pulse-off time, wire feed rate, % SiC, wire tension, flushing pressure, etc. Based on these process parameters, DA models for the surface roughness and the material removal rate were developed. The formulated DA models show a strong correlation with the experimental data. Analysis of variance (ANOVA) has been used to find the impact of the individual parameters on the response parameters.
 
P Subbaraju, K. Chandra Sekhar, P.r.s.s.v. Raju, K. Satyanarayana Raju, M. K. S. Varma,
Volume 31, Issue 2 (6-2020)
Abstract

Nowadays, data are the food of the digital world. The main rich sources of data are social networks such as Twitter, Facebook, Instagram, and LinkedIn. The data generated by micro-blogging services play a vital role in business intelligence, for example in product reviews, movie reviews, and election result prediction through social media data analysis. Sentiment analysis (SA) is the key method for predicting the emotions of netizens behind text expressed on social media. The main goal of this survey is to give a complete picture of the tools and techniques used in sentiment analysis and the relevant fields, with brief details.
K.v.k Sasikanth, K. Samatha, N. Deshai, B. V. D. S. Sekhar, S. Venkatramana,
Volume 31, Issue 3 (9-2020)
Abstract

Today's interconnected world generates huge amounts of digital data, as millions of users share their opinions and feelings on various topics every day through popular applications such as social media, micro-blogging sites, and review sites. Sentiment analysis on Twitter data is considered a very important problem, particularly for organizations or companies that want to know their customers' feelings and opinions about their products and services. Because of the data's nature, variety, and enormous size, it is very practical for several applications, ranging from choice and decision making to product assessment. Tweets are used to convey the sentiment of a tweeter on a specific topic, and companies survey millions of tweets on particular subjects to evaluate actual opinion and to learn customer feelings. The major goal of this paper is to collect, recognize, filter, reduce, and analyze the relevant opinions, emotions, and feelings of people about different products or services, categorizing them as positive, negative, or neutral, because such categorization can improve sales growth for a company's products, films, etc. We find that the Naïve Bayes classifier is the most widely used machine learning method for mining feelings from large data sources such as Twitter and popular social networks, because of its higher accuracy rates. In this paper, we scrutinize sentiment polarity analysis on Twitter data in a distributed environment, namely Apache Spark.
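A minimal multinomial Naïve Bayes sentiment classifier of the kind described, in pure Python on a toy corpus (the paper's Spark pipeline and Twitter data are not reproduced here):

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Multinomial Naive Bayes with add-one (Laplace) smoothing."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.priors = Counter(labels)              # class frequencies
        self.word_counts = defaultdict(Counter)    # class -> word counts
        for doc, y in zip(docs, labels):
            self.word_counts[y].update(doc.lower().split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, doc):
        def log_prob(y):
            total = sum(self.word_counts[y].values())
            lp = math.log(self.priors[y] / sum(self.priors.values()))
            for w in doc.lower().split():
                # Add-one smoothing handles words unseen for this class
                lp += math.log((self.word_counts[y][w] + 1)
                               / (total + len(self.vocab)))
            return lp
        return max(self.classes, key=log_prob)
```

At Twitter scale, the same per-class word counting is what a Spark job would distribute across the cluster before the (cheap) classification step.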

Page 2 of 3