Tag: Residual
-
Complete PyTorch code for a deep residual shrinkage network
1. Basic theory The deep residual shrinkage network is built from three parts: a residual network, an attention mechanism, and soft thresholding. Its features include: 1) Because soft thresholding is a common step in signal-denoising algorithms, the deep residual shrinkage network is well suited to strongly noisy, highly redundant data. At the same time, the […]
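The soft-thresholding step the excerpt refers to can be sketched on its own; a minimal NumPy version (the function name and test values here are illustrative, not taken from the article's PyTorch code):

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft thresholding: zero out small values and shrink large ones toward zero by tau."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(soft_threshold(x, 1.0))  # [-1.  0.  0.  0.  1.]
```

This is why it acts as a denoiser: coefficients below the threshold (treated as noise) are set exactly to zero, while informative, larger coefficients survive with reduced magnitude.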
-
R language value at risk: ARIMA, GARCH, the delta-normal method, rolling estimation of VaR (value at risk), and backtest analysis of stock data
Original link: http://tecdat.cn/?p=24492 Introduction The purpose of this analysis is to build a procedure that correctly estimates VaR under time-varying volatility. Value at risk is widely used to measure the market risk of financial institutions. Our time-series data comprise 1258 days of stock returns. In order to […]
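The delta-normal method named in the title reduces to a quantile of a fitted normal distribution; a minimal Python sketch on simulated daily returns (the function name and the simulated series are illustrative, not the article's stock data):

```python
import numpy as np
from statistics import NormalDist

def delta_normal_var(returns, alpha=0.99):
    """Delta-normal VaR: assume returns ~ Normal(mu, sigma), report the
    alpha-quantile of the loss distribution as a positive number."""
    mu = returns.mean()
    sigma = returns.std(ddof=1)
    z = NormalDist().inv_cdf(alpha)   # e.g. ~2.33 for alpha = 0.99
    return z * sigma - mu

rng = np.random.default_rng(0)
r = rng.normal(0.0005, 0.01, 1258)    # 1258 simulated daily returns (hypothetical)
print(round(delta_normal_var(r), 4))  # roughly 2.33 * sigma minus the mean return
```

A rolling estimate, as in the article, would simply apply this function to a moving window of returns so that sigma tracks changing volatility.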
-
R language: using locally weighted regression (LOWESS) for residual diagnostics of logistic regression
Original link: http://tecdat.cn/?p=22328 Regression diagnostics are no longer used only for general linear models; they have gradually been extended to generalized linear models such as logistic regression. However, many problems remain in this extension, because the residual-distribution assumptions of general linear models and […]
-
R language: spline curves, decision trees, AdaBoost, and gradient boosting (GBM) for regression, classification, and dynamic visualization
Original link: http://tecdat.cn/?p=22336 Boosting is an ensemble method that combines several classifiers into a single classifier. Boosting can also be understood from the perspective of econometrics: the goal is to minimize a loss function ℓ over a set of predictors M. This is an optimization problem, and the optimization here is carried out […]
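The optimization view in the excerpt — reduce a loss ℓ by adding one weak predictor from M at a time — can be made concrete with squared loss and regression stumps; a minimal Python sketch (all names and the toy target are illustrative):

```python
import numpy as np

def fit_stump(x, r):
    """Best single-split stump on 1-D x minimizing squared error against residuals r."""
    best_err, best = np.inf, (x[0], r.mean(), r.mean())
    for s in x[:-1]:                                   # candidate split points
        left, right = r[x <= s], r[x > s]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_err, best = err, (s, left.mean(), right.mean())
    return best

def boost(x, y, n_rounds=50, lr=0.1):
    """Gradient boosting with squared loss: each stump fits the current residuals,
    which are the negative gradient of 0.5 * (y - pred)^2."""
    pred = np.full_like(y, y.mean())
    for _ in range(n_rounds):
        s, vl, vr = fit_stump(x, y - pred)
        pred = pred + lr * np.where(x <= s, vl, vr)    # shrunken greedy update
    return pred

x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x)                              # toy regression target
pred = boost(x, y)
print(np.abs(pred - y).mean())                         # well below a constant fit's error
```

The learning rate plays the role of a step size in the loss-minimization: smaller steps need more rounds but generalize better, which is exactly the econometric trade-off the article goes on to discuss.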
-
Volatility prediction in R: the ARCH model and the HAR-RV model
Original link: http://tecdat.cn/?p=3832 Volatility is a key parameter in many pricing and risk models, such as Black-Scholes pricing or the calculation of value at risk. In these models, and in textbooks, volatility is usually treated as a constant. However, this is not the case: according to academic research, volatility is a […]
-
Deep residual network + adaptive parameterized ReLU activation function (parameter adjustment record 21): CIFAR-10 ~ 95.12%
Building on Parameter adjustment record 20, this article increases the number of residual modules from 27 to 60 and continues to test the performance of the deep residual network (ResNet) + adaptive parameterized ReLU activation function on the CIFAR-10 dataset. The adaptive parameterized ReLU function is placed after the second convolution layer […]
-
Python uses GARCH, EGARCH, and GJR-GARCH models with Monte Carlo simulation to predict stock prices
Original link: http://tecdat.cn/?p=20678 Stock-price prediction has long drawn wide attention from investors, governments, enterprises, and scholars. However, the nonlinearity and non-stationarity of the data make building a prediction model a complex and challenging task. In this article, I will explain how to combine the GARCH, EGARCH, and GJR-GARCH models with Monte Carlo simulation to establish an […]
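The GARCH family handles the non-stationarity the excerpt mentions by letting today's variance depend on yesterday's squared return and yesterday's variance; a minimal GARCH(1,1) simulator in Python (the parameter values are illustrative, not fitted to any stock):

```python
import numpy as np

def simulate_garch11(n, omega=1e-6, alpha=0.1, beta=0.85, seed=0):
    """Simulate n returns from GARCH(1,1):
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    rng = np.random.default_rng(seed)
    sigma2 = np.empty(n)
    r = np.empty(n)
    sigma2[0] = omega / (1 - alpha - beta)        # unconditional variance
    r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

r, s2 = simulate_garch11(5000)
# Sample variance should hover near the unconditional value omega / (1 - alpha - beta).
print(r.var(), 1e-6 / (1 - 0.1 - 0.85))
```

Monte Carlo price prediction, as in the article, repeats such simulations many times from fitted parameters and aggregates the resulting price paths; EGARCH and GJR-GARCH only change the variance recursion to allow asymmetric responses to negative returns.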
-
Deep residual network + adaptive parameterized ReLU activation function (parameter adjustment record 24): CIFAR-10 ~ 95.80%
Building on Parameter adjustment record 23, this article increases the number of convolution kernels (at least 64, at most 256) and continues to test the effect of the deep residual network + adaptive parameterized ReLU activation function on the CIFAR-10 dataset. The adaptive parameterized ReLU activation function is placed after the second convolution layer of the residual module, and its basic […]
-
A dynamic ReLU activation function: adaptive parameterized ReLU (parameter adjustment record 1)
Adaptive parameterized ReLU is a dynamic ReLU activation function. It was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020 (the first day of the Lunar New Year), and the preview version was published on the IEEE website on February 13, 2020. In this paper, the deep residual […]
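What makes such a ReLU "dynamic" is that the slope applied to negative inputs is computed per sample by a small learned head rather than fixed in advance. A NumPy forward-pass sketch under that assumption (the pool-dense-sigmoid head below is a simplification, and all weight shapes and names are illustrative, not the paper's exact architecture):

```python
import numpy as np

def dynamic_relu_forward(x, w1, w2):
    """x: (batch, features). The negative-part slope a in (0, 1) is derived
    from each sample's pooled magnitudes, so every sample gets its own slope."""
    pooled = np.abs(x).mean(axis=1, keepdims=True)   # global average of magnitudes
    h = np.maximum(pooled @ w1, 0.0)                 # small dense layer + ReLU
    a = 1.0 / (1.0 + np.exp(-(h @ w2)))              # sigmoid -> slope in (0, 1)
    return np.maximum(x, 0.0) + a * np.minimum(x, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w1 = rng.standard_normal((1, 4))
w2 = rng.standard_normal((4, 1))
y = dynamic_relu_forward(x, w1, w2)
print(y.shape)  # (4, 8): positives pass through; negatives are scaled per sample
```

With a fixed at 0 this reduces to plain ReLU, and with a fixed to a constant it reduces to leaky ReLU; the learned, input-dependent a is the distinguishing feature of the dynamic variant.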
-
A dynamic ReLU: adaptive parameterized ReLU (parameter adjustment record 4)
Continued from: A dynamic ReLU: adaptive parameterized ReLU (parameter adjustment record 3). Adaptive parameterized ReLU is a dynamic ReLU, submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020 (the first day of the Lunar New Year), and published on the IEEE website on February 13, 2020. In this paper, […]
-
[Not to be ignored] dynamic ReLU: adaptive parameterized ReLU (parameter adjustment record 6)
Continued from: Dynamic ReLU: adaptive parameterized ReLU (parameter adjustment record 5). Adaptive parameterized ReLU is a dynamic ReLU, submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020. This article continues adjusting the hyperparameters and testing the effect […]
-
[Harbin Institute of Technology version] dynamic ReLU: adaptive parametric ReLU (parameter adjustment record 11)
Adaptive parametric ReLU is a dynamic ReLU, submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020. Building on Parameter adjustment record 10, this article increases the number of residual modules from 27 to 60, and the effect […]