-
Research Article
Measuring Total Factor Productivity in a General Technical Progress Framework
Issue:
Volume 13, Issue 6, December 2024
Pages:
181-192
Received:
7 October 2024
Accepted:
25 October 2024
Published:
12 November 2024
DOI:
10.11648/j.ajtas.20241306.11
Abstract: The classical Solow total factor productivity accounting assumes that technical progress is Hicks neutral, which is only a special case in the reality of the world economy. This paper expands the setting of technical progress into a general technical progress framework, which covers Hicks-neutral technical progress, Harrod-neutral technical progress, Solow-neutral technical progress, and various factor-biased technical changes. Following the principles of statistical index numbers, this paper decomposes the output index into a total factor input index and a total factor productivity index, adopts a normalized CES production function with factor-augmenting technical progress to derive the calculation formulas for the two indices, constructs a new economic growth accounting system, and identifies a counteraction and compensation mechanism for diminishing marginal returns. If the elasticity of factor substitution is 1, or there is no technical progress bias and no factor allocation bias, the new accounting equation degenerates into the classic Solow growth accounting equation. The new accounting system can not only measure the influence of total factor input and total factor productivity on economic growth, but also measure the influences of factor input intensity and factor allocation bias within the growth rate of total factor input, and the influences of technical progress intensity and technical progress bias within the growth rate of total factor productivity. It is therefore more precise and accurate than the classical method.
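To make the accounting logic concrete, here is a minimal Python sketch of the classical Hicks-neutral Solow residual that the paper generalizes, together with a CES production function with factor-augmenting technical progress. All function names, parameter values, and growth rates are illustrative assumptions, not figures from the article.

```python
# Illustrative sketch only: the paper's full index-number decomposition is not
# reproduced here, just the classical Solow residual it generalizes and a CES
# technology with factor-augmenting progress. All numbers are made up.

def solow_residual(g_y, g_k, g_l, alpha):
    """Hicks-neutral TFP growth: g_A = g_Y - alpha*g_K - (1 - alpha)*g_L."""
    return g_y - alpha * g_k - (1.0 - alpha) * g_l

def ces_output(K, L, A_K, A_L, alpha, sigma):
    """CES output with factor-augmenting technical progress A_K, A_L.
    sigma is the elasticity of substitution (sigma != 1); rho = (sigma-1)/sigma.
    As sigma -> 1 the technology tends to the Cobb-Douglas case, in which the
    new accounting equation collapses to the classical Solow equation."""
    rho = (sigma - 1.0) / sigma
    return (alpha * (A_K * K) ** rho + (1.0 - alpha) * (A_L * L) ** rho) ** (1.0 / rho)

# Example: 5% output growth, 6% capital growth, 2% labour growth, alpha = 0.4
print(f"Solow residual: {solow_residual(0.05, 0.06, 0.02, 0.4):.4f}")  # 0.0140
print(f"CES output:     {ces_output(100.0, 50.0, 1.2, 1.1, 0.4, 0.8):.2f}")
```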
-
Research Article
Assessing the Quality of Ordinary Least Squares in General Lp Spaces
Kevin Hoffman,
Hugo Moises Montesinos-Yufa*
Issue:
Volume 13, Issue 6, December 2024
Pages:
193-202
Received:
20 September 2024
Accepted:
18 October 2024
Published:
18 November 2024
Abstract: In the context of regression analysis, we propose an estimation method capable of producing estimators that are closer to the true parameters than standard estimators when the residuals are non-normally distributed and when outliers are present. We achieve this improvement by minimizing the norm of the errors in general Lp spaces, as opposed to minimizing the norm of the errors in the typical L2 space, which corresponds to Ordinary Least Squares (OLS). The generalized model proposed here, the Ordinary Least Powers (OLP) model, can implicitly adjust its sensitivity to outliers by changing its parameter p, the exponent of the absolute value of the residuals. Especially for residuals of large magnitude, such as those stemming from outliers or heavy-tailed distributions, different values of p implicitly place different relative weights on the corresponding residual observations. We fitted OLS and OLP models on simulated data under varying error distributions that produce outlying observations and compared the mean squared errors relative to the true parameters. We found that OLP models with smaller p's produce estimators closer to the true parameters when the probability distribution of the error term is exponential or Cauchy, and that larger p's produce estimators closer to the true parameters when the error terms are distributed uniformly.
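As a rough illustration of the OLP idea, the following Python sketch minimizes the sum of |residuals|^p numerically. The optimizer choice, starting values, and simulated data are assumptions for the example, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch of Ordinary Least Powers: minimize sum(|y - Xb|**p).
# Optimizer, start values, and simulated data are illustrative assumptions.

def olp_fit(X, y, p):
    """Fit intercept + slope(s) by minimizing the sum of |residuals|**p.
    p = 2 recovers OLS; smaller p down-weights large residuals."""
    X1 = np.column_stack([np.ones(len(y)), X])      # prepend an intercept
    loss = lambda beta: np.sum(np.abs(y - X1 @ beta) ** p)
    beta0 = np.linalg.lstsq(X1, y, rcond=None)[0]   # OLS starting point
    return minimize(loss, beta0, method="Nelder-Mead").x

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.standard_cauchy(size=100)   # heavy-tailed errors
print("OLP p=1.2:", olp_fit(x, y, 1.2))             # typically nearer (1, 2)
print("OLS p=2.0:", olp_fit(x, y, 2.0))             # pulled around by outliers
```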
-
Research Article
Self-Exciting Threshold Autoregressive (SETAR) Modelling of the NSE 20 Share Index Using the Bayesian Approach
Issue:
Volume 13, Issue 6, December 2024
Pages:
203-212
Received:
11 October 2024
Accepted:
4 November 2024
Published:
26 November 2024
Abstract: The analysis and interpretation of time series data is of great importance across many fields, including economics, finance, and engineering. Such data, characterized by sequential observations over time, sometimes exhibit complex patterns and trends that commonly used models, such as linear autoregressive (AR) and simple moving average (MA) models, cannot capture. This limitation calls for more sophisticated and flexible models that can effectively capture the complexity of time series data. In this study, such a model, the Self-Exciting Threshold Autoregressive (SETAR) model, is used to model the Nairobi Securities Exchange (NSE) 20 Share Index, incorporating a Bayesian approach to parameter estimation. The objectives of this study are to analyze the properties of the NSE 20 Share Index data, to estimate the SETAR model parameters using the Bayesian approach, to forecast the NSE 20 Share Index for the next 12 months using the fitted model, and to compare the forecasting performance of the Bayesian SETAR model with the frequentist SETAR and ARIMA models. Markov Chain Monte Carlo (MCMC) techniques, namely Gibbs sampling and the Metropolis-Hastings algorithm, are used to estimate the model parameters. A SETAR(2; 4, 4) model is fitted and used to forecast the NSE 20 Share Index. The study's findings generally reveal an upward trajectory in the NSE 20 Share Index starting in September 2024. Although a slight decline is predicted for November, an upward trend is predicted for the following months. On comparing the performance of the models, the Bayesian SETAR model outperformed the linear ARIMA model over both short and long forecasting horizons, and it outperformed its frequentist counterpart over the longer forecasting horizon. These results demonstrate the applicability of SETAR modeling in capturing non-linear dynamics. The Bayesian approach advanced the model further by providing a flexible and robust way to estimate parameters while accommodating uncertainty.
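For intuition about what fitting a SETAR(2; 4, 4) involves, below is a hedged frequentist sketch in Python: two AR(4) regimes split by a threshold on a lagged value, estimated by conditional least squares with a grid search over candidate thresholds. The Bayesian estimation via Gibbs sampling and Metropolis-Hastings used in the paper is not reproduced here; the delay, threshold grid, and trimming fractions are assumptions.

```python
import numpy as np

# Toy two-regime SETAR(2; p, p) fit by conditional least squares; a stand-in
# for the paper's Bayesian MCMC estimation, not a reproduction of it.

def setar_fit(y, p=4, delay=1):
    n = len(y)
    # Lag matrix: row for time t holds (1, y[t-1], ..., y[t-p])
    X = np.column_stack([np.ones(n - p)] + [y[p - j:n - j] for j in range(1, p + 1)])
    Y = y[p:]
    z = y[p - delay:n - delay]                    # threshold variable y_{t-d}
    best = (np.inf, None, None)
    for r in np.quantile(z, np.linspace(0.15, 0.85, 50)):  # candidate thresholds
        lo, hi = z <= r, z > r
        if lo.sum() <= p + 1 or hi.sum() <= p + 1:
            continue                              # trim sparse regimes
        sse, betas = 0.0, []
        for m in (lo, hi):
            b = np.linalg.lstsq(X[m], Y[m], rcond=None)[0]
            sse += np.sum((Y[m] - X[m] @ b) ** 2)
            betas.append(b)
        if sse < best[0]:
            best = (sse, r, betas)
    return best  # (SSE, threshold, [regime-1 coefs, regime-2 coefs])

rng = np.random.default_rng(5)
y = rng.normal(size=300)
for t in range(4, 300):                           # simulate a two-regime series
    phi = 0.5 if y[t - 1] <= 0 else -0.4
    y[t] = phi * y[t - 1] + rng.normal(scale=0.5)
sse, thr, (b_lo, b_hi) = setar_fit(y)
print("estimated threshold:", round(thr, 3))
```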
-
Review Article
Methods of Optimal Accelerated Life Test Plans: A Review
Issue:
Volume 13, Issue 6, December 2024
Pages:
213-226
Received:
1 November 2024
Accepted:
21 November 2024
Published:
9 December 2024
Abstract: Accelerated life tests (ALT) are a powerful tool for obtaining time-based information on the lifespan or performance characteristics of items over time. Tests are performed under elevated stress levels rather than under normal use conditions, and the resulting information is used to predict lifespan under real-use conditions. Continuous accelerated testing under different stresses helps improve product reliability and formulate warranty policies. This paper aims to provide insight into the methods of optimal accelerated life test design. We first review the literature on the optimum design of accelerated life tests in chronological order over the last six decades. Second, we present lifetime distributions with their mean lifetime or qth quantile, and life-stress relationships with their different factor levels. We also present a flow chart outlining the process of accelerated life test planning. Further, we present the estimation methods commonly employed in the field of accelerated life testing, including least squares estimation, maximum likelihood estimation, graphical estimation, and Bayesian estimation. Finally, we provide an analytical discussion of accelerated life testing. This review aims to assist researchers, reliability engineers, and scientists in enhancing the design and planning of accelerated life tests.
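As one concrete instance of the maximum likelihood estimation the review surveys, the Python sketch below fits exponential lifetimes whose mean follows a log-linear life-stress relation (Arrhenius-type when the stress is an inverse temperature) and extrapolates the mean life to a use-level stress. The distributional choice, stress levels, and simulated data are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch: exponential lifetimes with mean life following
# log(theta) = b0 + b1 * s. All data and parameter values are made up.

def neg_log_lik(beta, times, stress):
    theta = np.exp(beta[0] + beta[1] * stress)    # mean life at each stress
    return np.sum(np.log(theta) + times / theta)  # exponential neg. log-likelihood

rng = np.random.default_rng(1)
stress = np.repeat([1.0, 1.5, 2.0], 30)           # three accelerated stress levels
true_theta = np.exp(6.0 - 2.0 * stress)
times = rng.exponential(true_theta)               # simulated failure times
fit = minimize(neg_log_lik, x0=[5.0, -1.0], args=(times, stress))
b0, b1 = fit.x
print("use-level mean life estimate:", np.exp(b0 + b1 * 0.5))  # extrapolate to s = 0.5
```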
-
Research Article
Statistical Network Analysis of Macroeconomic Variables in Ghana
Issue:
Volume 13, Issue 6, December 2024
Pages:
227-241
Received:
13 November 2024
Accepted:
23 November 2024
Published:
16 December 2024
DOI:
10.11648/j.ajtas.20241306.15
Abstract: In the rapidly evolving economic landscape of Ghana, understanding the intricate interdependencies between macroeconomic variables is pivotal for informed policymaking and strategic economic planning. This study employed network analysis to enhance our comprehension of Ghana's macroeconomic dynamics. Data were sourced from the World Development Indicators. Initially, a statistical network representing the interconnections between Ghana's principal macroeconomic variables was constructed from a partial correlation matrix, offering a visual and analytical perspective on their relationships. Subsequently, centrality measures and other network analysis tools were used to identify and quantify the influence of key economic indicators within this network. Results showed that Exports, Inflation, Exchange rate, Gross Domestic Saving, Manufacturing, and Gross National Expenditure played a significant role in the network. However, Agriculture and Imports were identified as the most influential variables, with high centrality scores across all centrality measures. Finally, an Exponential Random Graph Model was employed to provide a comparative baseline, shedding light on the uniqueness or randomness of the observed interrelationships. The significant parameters in the model include the presence of edges between nodes and the geometrically weighted edgewise shared partner statistic (GWESP), which captures the tendency of nodes to form connections based on common neighbors. The findings also revealed a 16.19% probability that a relationship exists between two macroeconomic variables when both are connected to the same third variable.
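A minimal Python sketch of the first two steps, building a network from a partial correlation matrix and ranking nodes by centrality, follows; the variable list, random placeholder data, and the 0.1 edge threshold are assumptions, and the ERGM stage is not reproduced.

```python
import numpy as np
import networkx as nx

# Illustrative sketch only: placeholder data stand in for the WDI series.

rng = np.random.default_rng(2)
names = ["GDP", "Inflation", "Exports", "Imports", "Agriculture", "ExchangeRate"]
data = rng.normal(size=(60, len(names)))          # placeholder observations

# Partial correlations from the inverse covariance (precision) matrix:
# pcorr_ij = -prec_ij / sqrt(prec_ii * prec_jj)
prec = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(prec))
pcorr = -prec / np.outer(d, d)
np.fill_diagonal(pcorr, 1.0)

G = nx.Graph()
G.add_nodes_from(names)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(pcorr[i, j]) > 0.1:                # drop weak links
            G.add_edge(names[i], names[j], weight=abs(pcorr[i, j]))

print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]))
print(sorted(nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1]))
```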
-
Research Article
Modelling and Forecasting Unemployment Trends in Kenya Using Advanced Machine Learning Techniques
Issue:
Volume 13, Issue 6, December 2024
Pages:
242-254
Received:
21 October 2024
Accepted:
18 November 2024
Published:
18 December 2024
DOI:
10.11648/j.ajtas.20241306.16
Abstract: This study conducts a comprehensive analysis exploring the relationship between key macroeconomic indicators and the unemployment rate, alongside evaluating the predictive accuracy of modern regression models. The correlation analysis examines the association between unemployment rate percentages and five macroeconomic variables: real Gross Domestic Product (GDP) growth, gross public debt as a percentage of GDP, population size, government revenue as a percentage of GDP, and government expenditure as a percentage of GDP. The results highlight significant correlations, particularly the strong positive relationship between unemployment rates and gross public debt (% of GDP) (0.8417), while real GDP growth shows a weak correlation (0.0783), indicating that debt levels may be a more crucial determinant of unemployment variations in this context. Additionally, a comparison of modern regression models, namely Support Vector Regression (SVR), Neural Network Regression, and Bayesian Regression, is conducted based on their performance metrics: Mean Absolute Error (MAE), Root Mean Square Error (RMSE), R-Squared, Akaike Information Criterion (AIC), and Bayesian Information Criterion (BIC). Among the models, Support Vector Regression outperforms the others, with the lowest MAE (0.0823), RMSE (0.0878), and the highest R-Squared value (0.9915), along with notably favourable AIC (-100.2072) and BIC (-69.4268) scores. Neural Network Regression also delivers competitive performance with a slightly higher MAE and RMSE but a similarly strong R-Squared (0.9887). In contrast, Bayesian Regression exhibits weaker predictive power with higher error metrics (MAE = 0.2579, RMSE = 0.3109) and a significantly lower R-Squared (0.8806), AIC (28.0408), and BIC (36.8352). These findings underscore the efficacy of SVR in predictive modelling for macroeconomic datasets, suggesting its suitability for unemployment rate forecasting.
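The model-evaluation step can be sketched with scikit-learn as follows; the synthetic predictors, SVR hyperparameters, and in-sample scoring are illustrative assumptions rather than the study's actual setup.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Illustrative sketch only: synthetic predictors stand in for the five
# macroeconomic indicators, and the SVR hyperparameters are assumptions.

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 5))   # placeholder for GDP growth, debt, population, revenue, expenditure
y = 0.8 * X[:, 1] - 0.1 * X[:, 0] + rng.normal(scale=0.1, size=40)  # unemployment proxy

model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
pred = model.predict(X)
print("MAE :", mean_absolute_error(y, pred))
print("RMSE:", np.sqrt(mean_squared_error(y, pred)))
print("R2  :", r2_score(y, pred))
```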
-
Research Article
Recursive Kernel Density Estimators Under Censoring Verifying a β-mixing Dependence Structure
Issue:
Volume 13, Issue 6, December 2024
Pages:
255-265
Received:
9 November 2024
Accepted:
29 November 2024
Published:
18 December 2024
DOI:
10.11648/j.ajtas.20241306.17
Abstract: In this paper, we consider the nonparametric recursive kernel density estimator on a compact set when the observations are censored and β-mixing. In this type of model, it is widely recognized that the usual empirical distribution function does not allow the distributions F and G to be estimated efficiently. Thus, Kaplan and Meier suggested a consistent estimator Gn to estimate G properly. Let {Tk, k ≥ 1} be a strictly stationary sequence of random variables distributed as T. We aim to establish the strong uniform consistency, with a rate, on a compact set of the recursive kernel estimator of the underlying density function f when the variable of interest T is right censored by another variable C. Under censoring, each observation is only partially known: we observe only the n pairs (Yi, δi), with Yi = min(Ti, Ci) and δi = 𝟙{Ti ≤ Ci}, where 𝟙A denotes the indicator function of the event A. First, we state the uniform convergence of this recursive estimator towards the density f. Then, we prove our main result by establishing three lemmas. Finally, we validate our theoretical results with a simulation study.
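A hedged sketch of one common estimator of this type follows: observed times are reweighted by a Kaplan-Meier estimate of the censoring survival function and smoothed with observation-indexed recursive bandwidths. The kernel, bandwidth sequence, and weighting are assumptions for illustration; the paper's exact estimator and rates differ in detail.

```python
import numpy as np

# Hedged illustration only: a kernel density estimate for right-censored data
# with Kaplan-Meier censoring weights and recursive bandwidths h_i = h0 * i**(-1/5).

def km_censoring_survival(Y, delta):
    """Kaplan-Meier estimate of the censoring survival P(C > t) at each Y_i.
    delta == 0 marks a censoring event (C observed instead of T)."""
    order = np.argsort(Y)
    Y, delta = Y[order], delta[order]
    n = len(Y)
    surv = np.empty(n)
    s = 1.0
    for i in range(n):
        if delta[i] == 0:
            s *= 1.0 - 1.0 / (n - i)   # n - i items still at risk at Y_i
        surv[i] = s
    return Y, delta, surv

def recursive_kde(x, Y, delta, h0=0.5):
    """Estimate f(x) as a weighted average of Gaussian kernels, each scaled by
    its own recursive bandwidth and an inverse censoring-survival weight."""
    Y, delta, surv = km_censoring_survival(np.asarray(Y, float), np.asarray(delta))
    n = len(Y)
    h = h0 * np.arange(1, n + 1) ** (-0.2)        # recursive bandwidths h_i
    w = np.zeros(n)
    pos = surv > 0
    w[pos] = delta[pos] / surv[pos]               # delta_i / Gbar_n(Y_i)
    kern = np.exp(-0.5 * ((x - Y) / h) ** 2) / np.sqrt(2.0 * np.pi)
    return float(np.mean(w * kern / h))

rng = np.random.default_rng(4)
T = rng.exponential(1.0, 200)                     # lifetimes of interest
C = rng.exponential(1.5, 200)                     # censoring times
Y, delta = np.minimum(T, C), (T <= C).astype(int)
print(recursive_kde(1.0, Y, delta))               # estimate of f(1) for Exp(1)
```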