Search published articles


Showing 59 results for Analysis

Erni Puspanantasari Putri, Erwin Widodo, Jaka Purnama, Bonifacius Raditya Sri Pramana Putra, Agatha Hannabel Avnanta Puteri,
Volume 0, Issue 0 (10-2024)
Abstract

Micro- and small-scale industries (MSIs) are pillars of Indonesia’s national economy, yet they face several issues as their businesses grow. Performance evaluation is one way to identify an MSI’s effectiveness. The objective of this research is to evaluate MSI performance in East Java Province, Indonesia, as an effort to improve it. The stepwise modeling approach (SMA) and data envelopment analysis (DEA) were applied to identify MSIs' effectiveness, classify inefficient MSIs, and formulate a development strategy for them. In the existing SMA concept, the variables remaining at the END step are the selected variables (model X-Y). This study proposes instead that the variables from the initial step up to step n+1 be considered when creating efficiency score (ES) models. Five models are proposed: 4X-3Y, 3X-3Y, 3X-2Y, 2X-2Y, and 2X-Y. The results indicate that the proposed ES model 3X-3Y is the best: it comprises 54% inefficient and 46% efficient DMUs. The inefficient MSI classification consists of six cities and fourteen regencies: Cluster_A (50%) contains four cities and six regencies, Cluster_B (25%) two cities and three regencies, Cluster_C (10%) two regencies, and Cluster_D (15%) three regencies.
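The DEA efficiency scores that underlie models such as 3X-3Y come from solving one small linear program per DMU. A minimal sketch of the input-oriented CCR envelopment model, with illustrative toy data rather than the paper's East Java dataset:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (envelopment form).
    X: (m, n) inputs, Y: (s, n) outputs; columns are DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta
    A_in = np.hstack([-X[:, [o]], X])           # sum_j lam_j * x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # sum_j lam_j * y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

# toy data: 2 inputs, 1 output, 3 DMUs (columns); DMU 2 uses twice DMU 1's inputs
X = np.array([[2.0, 4.0, 8.0],
              [3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0]])
scores = [round(ccr_efficiency(X, Y, j), 3) for j in range(3)]  # [1.0, 1.0, 0.5]
```

A score of 1 marks an efficient DMU; DMU 2 scores 0.5 because it could produce the same output with half its inputs.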

 
M.R. Alirezaee, S.A. Mir-Hassani,
Volume 17, Issue 4 (11-2006)
Abstract

In the evaluation of inefficient units by Data Envelopment Analysis (DEA), the referenced decision making units (DMUs) play an important role. Unfortunately, DMUs with extraordinary outputs can monopolize a reference set, an abnormality caused by outlier data. In this paper, we introduce a DEA model for evaluating DMUs under this circumstance. The layer model yields a ranking of the DMUs and an improvement strategy that moves a unit to a better layer.


R. Farnoosh, B. Zarpak,
Volume 19, Issue 1 (3-2008)
Abstract

Stochastic models such as mixture models, graphical models, Markov random fields, and hidden Markov models play a key role in probabilistic data analysis. In this paper, we fit a Gaussian mixture model to the pixels of an image, estimating its parameters with the EM algorithm.

In addition, each pixel of the true image is labeled using Bayes' rule. In effect, a new numerical method is introduced for finding the maximum a posteriori (MAP) estimate using the EM algorithm and the Gaussian mixture distribution: a sequence of priors and posteriors is constructed that converges to a posterior probability called the reference posterior probability. The MAP estimate determined from the reference posterior yields the labeled image, which is the segmented image with reduced noise. We demonstrate the method in several experiments.
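As a rough illustration of the EM-plus-Bayes-rule labeling idea, here is a minimal two-component 1D sketch on synthetic pixel intensities; it is not the paper's implementation, and the initialization choice is an assumption:

```python
import numpy as np

def em_gmm_labels(x, n_iter=50):
    """Fit a 2-component 1D Gaussian mixture by EM, then label pixels by Bayes' rule."""
    mu = np.array([x.min(), x.max()])            # crude deterministic initialization
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each pixel
        pdf = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return mu, r.argmax(axis=1)                  # MAP (Bayes-rule) pixel labels

# synthetic "image": dark pixels near 0.2 and bright pixels near 0.8
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(0.2, 0.05, 500), rng.normal(0.8, 0.05, 500)])
means, labels = em_gmm_labels(pixels)
```

On an actual image, `x` would be the flattened intensities and the label array reshaped back to the image grid gives the segmentation.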



B. Moetakef Imani, Kazemi Nasrabadi, Kazemi Sadeghi,
Volume 19, Issue 7 (8-2008)
Abstract

The stability behavior of low-immersion helical end milling processes is investigated in this paper. Low radial immersion milling operations involve interrupted cutting, which induces chatter vibration under certain cutting conditions. Time Finite Element Analysis (TFEA) is suggested as an approximate solution for the delayed differential equations encountered in interrupted milling. An improved TFEA is proposed that includes the effects of helix angle variations on cutting force, cutting time, and specific cutting force coefficients. For this purpose, five different cases are distinguished for the engagement limits of the cutting edges. It is observed that an increase in the helix angle improves the stability limit of the process. This is related to the flip bifurcation lobes, which start to separate from the main lobes and form isolated unstable islands. As the helix angle increases further, these unstable islands vanish.


M.H. Sadegh, S. Jafari, B. Nasseroleslami,
Volume 19, Issue 7 (8-2008)
Abstract

Modal parameter extraction of high-speed shafts is of critical importance in the mechanical design of turbo-pumps. Due to the complex geometry and peripheral components of turbo-pumps, difficulties can arise in determining modal parameters. In this study, the modal properties of a turbo-pump shaft were studied by experimental modal analysis using different excitation techniques. An innovative suspension method is proposed to reduce the noise-to-signal ratio resulting from classic suspensions. Comparison of the experimental results from the proposed suspension method and the traditional ones shows that the proposed approach is promising where classic methods fall short in the analysis of complex structures. To validate the experimental results, a numerical solution was carried out using simplified geometric modeling combined with the finite element method. The simplified modeling approach can be considered a reliable theoretical method for numerical modal analysis of similar structures. Comparison of the experimental and numerical results shows good conformity between the two approaches.


S.M. Mohammad Seyedhoseini, M. Ali Hatefi,
Volume 20, Issue 1 (5-2009)
Abstract

Selecting an effective project plan is a significant area of project management. The present paper introduces a technique to identify the project-plan efficient frontier for assessing alternative project plans and selecting the best one. The efficient frontier involves two criteria: project cost and project time. The paper also presents a scheme to incorporate a Directed Acyclic Graph (DAG) into project risk analysis.

This scheme is used to estimate the expected impacts of project risks on project cost and project time. A theoretical model is also defined to integrate project risk analysis with overall project planning using the breakdown structures. We believe the proposed technique helps managers deal effectively with complicated project-plan assessment and selection problems. The technique was applied in construction companies, where it produced considerable cost and time improvements.

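The cost-time efficient frontier described above is simply the set of plans not dominated on both criteria. A minimal sketch with hypothetical plans (names, costs, and durations are illustrative, not from the paper):

```python
def efficient_frontier(plans):
    """Keep plans not dominated on (cost, time); lower is better on both.
    plans: list of (name, cost, time) tuples in hypothetical units."""
    return sorted(
        [(name, c, t) for name, c, t in plans
         if not any(c2 <= c and t2 <= t and (c2 < c or t2 < t)
                    for _, c2, t2 in plans)],
        key=lambda p: p[1])

# hypothetical alternative project plans: (name, cost, duration)
plans = [("A", 100, 12), ("B", 120, 9), ("C", 110, 14), ("D", 150, 8)]
frontier = efficient_frontier(plans)  # C is dominated by A (costlier and slower)
```

The decision maker then chooses among the frontier plans according to the preferred cost-time trade-off.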
Jafar Mahmudi, Soroosh Nalchigar , Seyed Babak Ebrahimi,
Volume 20, Issue 1 (5-2009)
Abstract

Selection of an appropriate set of Information System (IS) projects is a critical business activity that is very helpful to all organizations. In this paper, after describing a real IS project selection problem at the Iran Ministry of Commerce (MOC), we introduce two Data Envelopment Analysis (DEA) models and show their applicability for identifying the most efficient IS project among 8 competing projects. To provide further insight, the results of the two introduced models are then compared. Notably, with the basic DEA models (CCR and BCC), the decision maker cannot single out the most efficient Decision Making Unit (DMU), since these models may identify several DMUs as efficient, all with efficiency scores equal to 1. As an advantage, the applied models can identify the most efficient IS (under constant and variable returns to scale) by solving only one linear program (LP), so they are computationally efficient, whereas the basic DEA models require solving one LP for each IS.

Peyman Akhavan, Reza Hosnavi, Sanjaghi Mohammad,
Volume 20, Issue 3 (9-2009)
Abstract

This paper develops a knowledge management (KM) model for some Iranian academic research centers (ARCs) based on KM critical success factors. General KM critical success factors (CSFs) were identified through a literature review. The research procedure then led to the identification of 16 KM critical success factors in Iranian ARCs, through a first-stage survey of about 300 respondents. These 16 factors were subsequently surveyed again by experts through a Delphi panel, who suggested practical solutions for exploiting the factors in ARCs through a KM framework based on a KM cycle. This two-year research was conducted from 2006 to 2008.

Ali Habibi Badrabadi , Mohammad Jafar Tarokh,
Volume 20, Issue 3 (9-2009)
Abstract

Service Oriented Enterprises (SOEs) are subject to constant change and variation. In this paper, changes are considered from an economic perspective based on the notion of service culture. Once a change is implemented, the costs of some member services may increase while the costs of others may decrease. We construct a game-theoretic model to capture the possibly conflicting interests of different parties in an SOE. Three incentive mechanisms are applied to the model: the first shares the utility equally among the services involved in the change; the second utility-sharing rule is based on Nash's bargaining solution, which accommodates possible biased interdependencies inside the network; and the third, based on Harsanyi's modified Shapley value, takes into account possible coalition formation among the network parties. Since the three rules are analytically solvable, the principles of utility sharing can be implemented, for instance, as ex-ante contracts.

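For intuition on the third rule, the plain (unmodified) Shapley value can be computed by averaging each player's marginal contribution over all orderings. A small sketch with a hypothetical three-service game; Harsanyi's modification from the paper is not reproduced here:

```python
from itertools import permutations
from math import factorial

def shapley(players, v):
    """Shapley value: average marginal contribution over all player orderings.
    v maps a frozenset of players to the coalition's utility."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)  # marginal contribution
            coalition = coalition | {p}
    n_orders = factorial(len(players))
    return {p: phi[p] / n_orders for p in players}

# hypothetical utility gains from a change, for services A, B, C;
# here the value depends only on coalition size (a symmetric game)
def v(S):
    table = {0: 0.0, 1: 1.0, 2: 3.0, 3: 6.0}
    return table[len(S)]

values = shapley(["A", "B", "C"], v)  # symmetric game: each service gets 2.0
```

In a symmetric game the rule divides the total utility equally; asymmetric characteristic functions yield unequal, contribution-weighted shares.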
Reza Kazemzadeh, Ali Reaziat,
Volume 20, Issue 4 (4-2010)
Abstract

In today’s extremely competitive markets, it is crucial for companies to strategically position their brands, products, and services relative to their competitors. With the emerging trend toward the internationalization of companies, especially SMEs, and the growing use of the Internet in this regard, much attention has turned to effective involvement of the Internet channel in companies' marketing mixes. This has introduced the new notion of market space (the Web) versus the traditional battleground of the marketplace in which companies compete. The growth of presence in the market space has been exponential, both in general and within specific industries.

This brings to attention the importance of Web presence and makes it crucial for companies to treat competition in the market space strategically. Positioning on the Net is very different and requires its own set of strategies as part of the new marketing paradigm. This study addresses the need to understand and measure the nature of the positioning of company Web sites on the Internet. Its aim is to introduce a statistical technique to compare the positioning of Web sites in and across industries.

To this end, a group of Web sites from the home appliance manufacturing industry was selected, and the technique of correspondence analysis was applied to produce maps that can be studied and interpreted.

The results indicate that, whether by company strategy or by accident, these Web sites are positioned differently and may follow or affect different marketing policies of their owners. Finally, the implications of this technique for management are discussed, including how it can be used by new home appliance manufacturers, or by those who want to compare their sites with competitors' sites, to benchmark and/or revise their policies and strategies.
R. Tavakkoli-Moghaddam, S. Mahmoodi,
Volume 21, Issue 2 (5-2010)
Abstract

A data envelopment analysis (DEA) method can be regarded as a useful management tool for evaluating decision making units (DMUs) with multiple inputs and outputs. In some cases we face imprecise inputs and outputs, such as fuzzy or interval data, so the efficiency of DMUs is not exact. Most researchers have recently been interested in computing efficiency and ranking DMUs. Traditional DEA models cannot provide a complete ranking of efficient units; they can only distinguish between efficient and inefficient units. In this paper, the efficiency scores of DMUs are computed by a fuzzy CCR model and the fuzzy entropy of DMUs. The units are then ranked and compared under the two foregoing procedures. To do this, fuzzy entropy based on a common set of weights (CSW) is used. Furthermore, the fuzzy efficiency of DMUs at a given optimistic level is computed. Finally, a numerical example taken from a real case study is considered and the related concepts are analyzed.


A. Doostparast Torshizi, S.R. Hejazi,
Volume 21, Issue 2 (5-2010)
Abstract

In a highly competitive industrial market, failure analysis is an unavoidable fact of complex industrial systems. The reliability of such systems depends not only on the reliability of each element but also on the occurrence of sequences of failures. In this paper, a novel approach to sequential failure analysis is proposed, based on fuzzy logic and Petri nets, which tracks all the risky behaviors of the system, determines the potential failure sequences, and prioritizes them so that corrective actions can be performed. The failure sequences are prioritized using a novel similarity measure between generalized fuzzy numbers. The proposed methodology is demonstrated with an example of two automated machine tools and two input/output buffer stocks.

Saeed Ramezani, Azizollah Memariani,
Volume 22, Issue 2 (6-2011)
Abstract

 

Keywords: Condition Monitoring, Oil Analysis, Wear Behavior, Fuzzy Rule-Based System

Maintenance, as a support function, plays an important role in manufacturing companies and operational organizations. In this paper, fuzzy rules are used to interpret linguistic variables for the determination of priorities. With this approach, verbal expressions that cannot be explicitly analyzed or statistically expressed are quantified and used in decision making.

This research also seeks to justify the importance of historical data in oil analysis for fault detection. Initial rules were derived by decision trees and visualization, and these fault-diagnosis rules were then corrected by experts. With access to good information sources, the wear behavior of diesel engines is studied, and the relation between the final engine status and selected features in oil analysis is analyzed. Determining the effective features in the condition monitoring of equipment, and their contributions, is studied through a data mining model.

Gholam Reza Jalali Naieni, Ahmad Makui, Rouzbeh Ghousi,
Volume 23, Issue 1 (3-2012)
Abstract

Fuzzy logic is a concept that, by entering various professional fields, has created different scientific attitudes and, in some cases, has had remarkable effects on the results of practical research. However, the stochastic and uncertain situations in the risk and accident field affect the possibility of forecasting and preventing accidents and their undesired results.

In this paper, a fuzzy approach is used to evaluate and forecast risk in accidents caused by working with vehicles such as lift trucks. By using fuzzy rules to forecast various accident scenarios, considering all the input variables of the problem, the uncertainty space is reduced to the minimum possible and a better accident-forecasting capability is created compared with classic two-valued approaches. This approach helps senior managers make decisions in risk and accident management with stronger scientific support and greater reliability.


, ,
Volume 23, Issue 2 (6-2012)
Abstract

A practical application of the spherical involute surface to forged straight bevel gears is provided and demonstrated in this work. Conjugate (pure involute) theoretical surfaces are developed from the input design parameters. The surfaces are then modified to suit the actual application (an automotive differential). Unloaded (or low-load) tooth contact analysis of the modified surfaces is performed to predict the contact pattern. To verify the procedure and predictions, actual straight bevel gears were forged using the provided surfaces, and their contact pattern was compared to the predictions. The influence of misalignments on gear performance is investigated in order to provide a more robust design.

Seyed Omid Hasanpour Jesri, Abbas Ahmadi, Behrooz Karimi, Mohsen Akbarpour,
Volume 23, Issue 4 (11-2012)
Abstract

One of the most important issues in urban planning is developing sustainable public transportation, and the basic precondition for this is analyzing the current situation, especially from data. Data mining is a set of techniques that go beyond statistical data analysis; clustering is one subset of it, and one of its techniques is used here to analyze passengers' trips. The results of this research reveal relations and similarities among different segments, with uses ranging from strategic to tactical and operational areas. The approach is completely novel for trip-pattern analysis in transportation, and a process is proposed that can also be implemented in highway analysis, as well as in traffic and vehicle studies that need automatic number plate recognition (ANPR) for data gathering. A real case study is examined with the developed process.

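A minimal sketch of the kind of clustering involved: plain k-means on hypothetical trip features (boarding hour, trip length). The paper's actual process and data are not reproduced; the features and initialization are assumptions for illustration:

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Plain k-means with deterministic farthest-point initialization."""
    centers = X[[0]]
    for _ in range(k - 1):
        # next center: the point farthest from all chosen centers
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1).min(1)
        centers = np.vstack([centers, X[d.argmax()]])
    for _ in range(n_iter):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        new = np.vstack([X[labels == j].mean(0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# toy trip records: (boarding hour, trip length in km) for two passenger segments,
# e.g. short morning commutes vs. long evening trips
rng = np.random.default_rng(1)
trips = np.vstack([rng.normal([8.0, 5.0], 0.5, (50, 2)),
                   rng.normal([18.0, 20.0], 0.5, (50, 2))])
labels, centers = kmeans(trips, 2)
```

Each cluster center summarizes one trip pattern; in a real study the features would come from ticketing or ANPR records.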
Yahia Zare Mehrjerdi, Maryam Dehghan,
Volume 24, Issue 1 (2-2013)
Abstract

In a dynamic and competitive market, managers seek effective strategies for new product development (NPD). Since there has been no thorough research in this field, this study reviews the risks present in the NPD process, analyzes them through an FMEA approach to prioritize them, and models the behavior of the NPD process and its main risks using system dynamics. First, we present NPD concepts and definitions. We then base our study on a literature review of NPD risks and provide an FMEA approach to define risk priorities. Using the main risks obtained, we model the NPD process risks with system dynamics to analyze the system and the effects of the risks on it. A safety clothing manufacturer is considered as a case study.

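The FMEA prioritization step reduces to computing a risk priority number (RPN) per failure mode and sorting. A minimal sketch with hypothetical NPD risks and ratings, not the paper's data:

```python
# hypothetical NPD risks with severity, occurrence, detection ratings (1-10 scales)
risks = [
    ("market shift",   8, 5, 4),
    ("supplier delay", 6, 7, 3),
    ("design defect",  9, 3, 6),
]

# classic FMEA priority: RPN = severity * occurrence * detection
ranked = sorted(((name, s * o * d) for name, s, o, d in risks),
                key=lambda t: t[1], reverse=True)
# highest-RPN risks are addressed first
```

The top-ranked risks would then feed the system dynamics model as the main risk drivers.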
Moharram Habibnejad Korayem, Arastoo Azimi, Ali Mohammad Shafei,
Volume 24, Issue 3 (9-2013)
Abstract

In this research, a sensitivity analysis of geometric parameters such as the length, thickness, and width of a single-link flexible manipulator with respect to the maximum deflection (MD) of the end effector and the vibration energy (VE) at that point is conducted. The equation of motion of the system is developed based on the Gibbs-Appell (G-A) formulation, and the elastic properties of the system are modeled with the assumed modes method (AMM). Two theories are used to obtain the end-point MD and the VE of the end effector: first, Timoshenko beam theory (TBT) is applied to consider the effects of shear and rotational inertia; then Euler-Bernoulli beam theory (EBBT) is used. Sobol's sensitivity analysis method is applied to determine how the VE and the end-point MD are influenced by those geometric parameters. Finally, the results of the two theories are compared.

Mohammadjafar Tarokh, Mahsa Esmaealigookeh,
Volume 24, Issue 4 (12-2013)
Abstract

Customer Lifetime Value (CLV) is an important concept in marketing and the management of organizations for increasing captured profitability. The total value that a customer produces during his or her lifetime is called customer lifetime value. This value can be calculated through different methods, each considering different parameters; depending on the industry, firm, business, or product, the parameters of CLV may vary. Companies use CLV to segment customers, analyze churn probability, allocate resources, or formulate strategies for each segment. In this article we review the most widely presented models for calculating CLV. The aim of this survey is to gather the CLV formulations of the past three decades, including Net Present Value (NPV), Markov chain models, probability models, RFM, survival analysis, and so on.
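As one concrete instance of the NPV family of models surveyed, the simple retention-based discounted formula CLV = Σ_t m_t · r^t / (1 + d)^t can be sketched as follows (the margin, retention, and discount numbers are illustrative, not from the article):

```python
def clv_npv(margins, retention, discount):
    """Retention-model CLV: sum over periods t of m_t * r**t / (1 + d)**t.
    margins: expected margin per period; retention: per-period retention probability;
    discount: per-period discount rate."""
    return sum(m * retention ** t / (1 + discount) ** t
               for t, m in enumerate(margins))

# illustrative: $100 margin per period over 5 periods, 80% retention, 10% discount
value = clv_npv([100.0] * 5, retention=0.8, discount=0.1)  # about 292.06
```

Other surveyed families (Markov chain, RFM, survival analysis) replace the fixed retention probability with richer models of customer state and churn.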

Page 1 of 3