## ABSTRACT

The aim of this dissertation is to analyze risk in the New York Stock Exchange using statistical methods and the computation of Value at Risk. Value at Risk (VaR) is a methodology that standardizes the calculation of the various risks that arise in businesses such as banks, stock exchanges and corporations. Risk is defined as the probability of obtaining a result different from the one expected; the relevant factors are the position of the entity, the risk factor under consideration and the time of calculation. VaR therefore seeks to quantify risk in monetary units, defined as the maximum probable loss on a position over a specified horizon, according to the market conditions in which the risk factor is traded. The VaR methodology is the natural development of Markowitz's portfolio theory of the 1950s. Its main impetus in the financial world came from JP Morgan, when a senior executive asked for the probable maximum loss over twenty-four hours in a post-closure report; the methodology the bank developed for this purpose, RiskMetrics, today gives its name to the approach.

The VaR can be understood in different ways, each of which constitutes a definition, such as the one proposed by RiskMetrics. It should be emphasized, first, that this is a study of market risk and its measurement through the VaR methodology; within it appear the technical characteristics of the JP Morgan proposal, and the possibility that this entity supplies the required database. Its application is based on mark-to-market accounting and is carried out on returns rather than price estimates. This dissertation can be divided into two parts: one devoted to the analysis of the VaR methodology, and one that analyzes risk in the New York Stock Exchange using statistical methods and the computation of Value at Risk.

The latter part is accompanied by a comprehensive statistical study of the behavior of volatility and of the correlations between risk factors; a statistical highlight of this section is the decomposition employed in the estimation of the covariance matrix, used both in the delta-normal approach and in the Monte Carlo simulation approach. In short, depending on the initial parameters and the model used, there are endless possibilities for risk estimation under the VaR methodology, which makes it difficult to choose a specific practical application; each entity must therefore search for the variant that meets its needs and objectives. It follows that the VaR is not a single value but fluctuates depending on the initial decisions made about the model, so risk analysis through this methodology would be incomplete if it were not accompanied by complementary analyses. The aim is therefore to establish the goodness of fit of the VaR model used, which is a statistical objective, and to complete the information provided, with the intent of facilitating decision making.

INTRODUCTION

What is the value at risk?

Value at Risk, or VaR, is a technique used to estimate the probability of portfolio losses based on the statistical analysis of historical price trends and volatilities. For a given significance level and time horizon, the portfolio's value at risk is the estimate of the maximum potential loss under "normal" market conditions, assuming no changes in the portfolio over the horizon. VaR is today one of the measures of risk used by regulatory agencies around the world, as well as by banks, financial institutions and portfolio managers. The concept started gaining popularity in the early 1980s, when major financial institutions in developed countries began using it as a measure of risk in their portfolios. In the mid-1990s the use of VaR as a risk measure was extended to regulators, reflected in the 1995 recommendation of the Basel Committee on Banking Supervision that banks use VaR calculations to estimate capital requirements based on the market risk they assumed; the Federal Reserve of the United States adopted this recommendation that same year. The popularity of Value at Risk for establishing risk management strategies stems from it being a simple concept with an easy interpretation: the best possible estimate of the loss for a given portfolio, confidence level and time horizon. However, it has shortcomings that attract strong criticism from some quarters: it assumes constant portfolio composition, which makes VaR unsuitable for portfolios with high transaction volumes; the calculation focuses on the central part of the distribution and ignores the tails; and it assumes a "normal" environment, excluding extreme events such as certain catastrophes or other occurrences in the very long term that fall outside the calculation horizon.

Calculating Value at Risk

VaR answers the question: how much can I lose over N days with probability (1−α)%? Here N is the chosen time horizon and (1−α)% the confidence level for which the VaR calculation is performed. The choice of α and N is essential in calculating the value at risk. Commonly used values are 1% and 5% for the significance level, and one day and two weeks for the time horizon, although other combinations are also used. The choice of α and of the time horizon depends on the intended use of the VaR and on the characteristics of the portfolio. For example, for a portfolio in which a large number of operations are performed within a few hours, a short horizon should be chosen, for example two hours, while for a more stable portfolio longer horizons are appropriate, for example one day (the usual choice) or one year. Likewise, the value of α should be chosen according to the use of the VaR and the characteristics of the portfolio; for example, if the VaR calculation is aimed at compliance with requirements imposed by regulatory agencies, the calculation is usually performed for a high confidence level (99%, i.e. α = 0.01 or 1%).

In short, for a given time horizon and confidence level, we calculate the lower α-percentile of the distribution of possible portfolio values. The cut at this percentile of the distribution gives us the lowest portfolio value expected with (1−α)% confidence. We can therefore define the value at risk as:

VaR = V_0 − V_c

Where:

VaR is the value at risk

V_0 is the current value of the portfolio

V_c is the lowest portfolio value expected with (1−α)% confidence

For example, consider a time horizon of one day and a confidence level of 99% (α = 1%). The point V_c is the lowest value the portfolio could take with 99% confidence, and the maximum possible loss (value at risk) is V_0 − V_c. In mathematical language, the value at risk is defined as follows: for a given confidence level α, the VaR of a portfolio with loss L is the smallest value l such that the probability that L exceeds l is not greater than (1 − α)%:

VaR_α(L) = inf {l ∈ ℝ : P(L > l) ≤ 1 − α} = inf {l ∈ ℝ : F_L(l) ≥ α}

The left-hand equality is the definition of value at risk, while the right-hand equality assumes a previously defined probability distribution, which makes it valid only for parametric VaR.
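For a continuous, strictly increasing loss distribution, the definition above reduces to the α-quantile of L. A minimal sketch, assuming an illustrative normal loss distribution (the parameters are not from the text):

```python
from statistics import NormalDist

def parametric_var(mean_loss: float, std_loss: float, alpha: float) -> float:
    """VaR at confidence alpha: smallest l with P(L > l) <= 1 - alpha,
    which for a continuous distribution is the alpha-quantile of the loss L."""
    return NormalDist(mu=mean_loss, sigma=std_loss).inv_cdf(alpha)

# Loss with mean 0 and standard deviation 1: the 99% VaR is the
# 0.99 quantile of the standard normal, about 2.33.
var_99 = parametric_var(0.0, 1.0, 0.99)
print(round(var_99, 2))  # 2.33
```

The quantile form is exactly the right-hand equality above; the left-hand (infimum) form would also apply to discrete or non-invertible distributions.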

LITERATURE REVIEW

New York Stock Exchange (NYSE)

A stock exchange is the regulated market for negotiable securities representing debt or equity and for financial derivatives such as futures, options and covered warrants. It is particularly relevant both for the volume of trading and for its specific economic function, which is to channel the savings of households and institutional investors to businesses, which raise capital on the stock market through share issues or loans to finance investment. Stock prices reflect investors' forecasts of the profits of listed companies and are an important indicator for guiding the flow of savings toward the investments with the best return prospects. The stock exchange ensures that investment in securities enjoys high liquidity, because the volume and regularity of transactions make it possible to immediately resell securities that one does not want to keep in the portfolio. However, since prices vary continuously, the realizable value involves the risk of capital loss (or, under favorable conditions, the possibility of speculative gains). Since the market valuation of a share depends on the expected future profits of the issuing company, the quality of corporate information is essential for the proper functioning of the stock market and for the protection of investors. The normal activity of the exchange, with its continual price fluctuations, dampens sharp price swings and can exert a useful moderating influence. In recurrent episodes of speculative bubbles, however, the rise in prices is fueled by expectations of further price rises that no longer correspond to a realistic assessment of future performance. Prices of securities can then rise excessively, and the consequent collapse can lead to panic. Both phenomena can also result from the spread of biased rumors or from insider trading.
The supervision exercised over the stock market by supervisory authorities (whose nature differs from country to country) aims to promote transparency and informational symmetry, for the protection of investors.

Large formations such as states have symbols: flags and anthems are an essential feature of events conducted under state leadership and of the exercise of state powers. People know them and associate them with the state, yet they are inanimate and in a sense static, unchanging over time; a change in them usually signals a fundamental change in the methods of social control. Other symbols, such as the White House in the U.S. or the Kremlin in Russia, primarily embody state power in the mass consciousness: important activities that significantly affect society take place in them daily, and although these may be only the tip of the iceberg, it is precisely such symbols that attract media attention. The New York Stock Exchange (NYSE), the main U.S. stock exchange, has become such a dynamic symbol for the U.S. financial industry and for the world. Being the largest exchange in the world, it has become a symbol of the financial power of the USA. The exchange determines the internationally renowned Dow Jones Industrial Average of industrial company shares, as well as the NYSE Composite index. These are the subjects discussed in this paper.

Modern structure and the listing requirements of the New York Stock Exchange

For many years the NYSE was a voluntary association; since 1972 it has been a non-profit corporation owned by its members. Since 1953 the number of members of the NYSE has remained unchanged at 1,366. To become a member of the exchange, a seat must be purchased, the price of which varies with supply and demand. In recent decades, the minimum price of a seat on the NYSE was 35 thousand dollars (in 1974) and the maximum 1,150 thousand dollars (in 1987). Seats can be rented by those who meet the requirements of the exchange, and about one-third of the seats are rented. Only individuals can be members of the New York Stock Exchange, and only if they have purchased a seat. Members can act as specialists, dealers, brokers, or registered traders. Members of the New York Stock Exchange conclude transactions directly between themselves, without the help of an outside broker. Trading on the New York Stock Exchange is conducted mainly for cash; futures contracts are limited to options transactions, and purchases of securities on credit require a cash margin of about 30% of market value. To be admitted to quotation, a company must meet the following requirements:

- Profit before tax for the last year: $2.5 million;

- Profit for the previous 2 years: $2.0 million;

- Net value of tangible assets: $18.0 million;

- Number of shares in public ownership: 1.1 million;

- Market value of shares: $18.0 million;

- A minimum number of shareholders holding 100 shares or more.

In addition, the average monthly volume of trading in the issuer's shares must be not less than 100 thousand dollars over the last 6 months. The NYSE lists the shares of 5,248 issuers (approximately 9,700 common and preferred share issues).

Mathematical Definition of VaR

Value at Risk (VaR) is a measure of the risk associated with a financial market activity. It represents, over a given time horizon (usually 1 day or 10 days) and with a given level of probability or statistical confidence (usually 95% or 99%), the maximum conceivable loss arising from holding the financial asset under evaluation. This loss reflects the interaction of a number of factors that the model assumes to be related to each other in determining the return. The time horizon chosen for the calculation of VaR reflects the minimum time necessary to liquidate the investment in the event of loss (which may be as short as 1 day on very liquid markets, or 10 days if a longer time is assumed to be required for liquidation). Value at Risk is also used, most importantly, to determine the minimum level of capital needed to cover losses on financial assets generated by market risks. The measure is therefore applicable to the calculation of the market risk of portfolios of equities, bonds, foreign-currency investments and financial derivatives. VaR thus provides an educated estimate of the uncertain value of the financial asset at the end of the chosen time horizon, with a given level of statistical confidence. Suppose, for example, that we calculate the VaR over the horizon from t_0 to t_0 + n of a portfolio consisting of a single financial asset whose value has a Gaussian probability distribution with mean μ = €100 and standard deviation σ = €39.39. The interval μ ± 1.65σ (i.e. the interval between €35 and €165) contains 90% of the statistical distribution of the asset's values. If the portfolio, at the time of the assessment, is quoted exactly at its equilibrium value (€100 in the example in Figure 1), then over the given time horizon (e.g. 1 day) the VaR (maximum potential loss) is €65, with a statistical confidence level of 95%.
In other words, between t_0 and t_0 + n, the position may drop below €35 in only 5% of cases, and with probability 95% it will be greater than this value.
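The worked example above can be reproduced directly with the standard library's normal distribution (the figures match the text up to the rounding of 1.645 to 1.65):

```python
from statistics import NormalDist

# The example's Gaussian portfolio value: mean 100, std 39.39 (euros).
dist = NormalDist(mu=100.0, sigma=39.39)

v0 = 100.0                 # current (mean) portfolio value
v_c = dist.inv_cdf(0.05)   # 5th percentile: worst value at 95% one-sided confidence
var_95 = v0 - v_c          # maximum potential loss

# Roughly 35.2 and 64.8, i.e. the ~35 and ~65 of the example in the text.
print(round(v_c, 1), round(var_95, 1))
```

Note that the one-sided 95% quantile (z ≈ 1.645) corresponds to the two-sided 90% interval μ ± 1.65σ cited in the text.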

Let F_X(x) = P(X ≤ x) = p be the cumulative distribution function (assumed here, for simplicity, to be continuous) of the random gain X of an intermediary over a certain period of time. Its meaning is clarified by the following example: if the period is a week and F_X(−1 million) = 0.05, there is a 5% chance that the weekly economic result is less than (or at best equal to) −1 million euros, i.e. that in a week a loss of at least 1 million is recorded (negative values of the gain x mean losses). Having fixed the period of time over which the gain X is measured, and the probability p at a level that is usually very low (between 0.1% and 1%), the value at risk V@R(X, p) is the opposite of the amount c satisfying F_X(c) = P(X ≤ c) = p (provided the cumulative distribution function is strictly increasing and continuous). Denoting by F_X^{-1}(p) the inverse of the cumulative distribution function, we have c = F_X^{-1}(p), and more formally V@R(X, p) = −F_X^{-1}(p). Equivalently, but emphasizing that these are subjective probabilities (degrees of confidence), we say that −F_X^{-1}(p) is the value at risk with confidence q = 1 − p of the distribution of X. If X is the variable describing the results of operations of a financial institution over a certain period of time, the V@R measures the maximum loss (or worst outcome) that can occur with confidence 1 − p; said another way, there is a degree of confidence 1 − p that the loss does not exceed the value at risk.

Methods used in the calculation of VaR in the NYSE

There are several methods of calculating the value at risk and all can be classified into two categories:

- Nonparametric methods: based on historical data to construct the distribution

- Parametric methods: a distribution for the data is assumed

It is precisely on the basis of the market risks to which a bank is exposed, as measured by VaR, that supervisors require it to hold, in the form of appropriate unavailable budgetary reserves, a minimum capital to cover these risks (on a daily basis or over other periods). If a bank does not have internal models for the calculation of VaR, developed by its internal risk management function and validated by the supervisory bodies responsible for monitoring adequacy (the national central banks), VaR models are prescribed directly by the legislation, with reference to Basel III (standard models), which sets out rigidly structured and standardized calculation processes. The most common methods for calculating VaR are a) the historical simulation method, b) the normal method, and c) the Monte Carlo method.

a) Historical simulation method

This methodology identifies the VaR as a percentile of the distribution of historical returns of the financial asset. It is the easiest method to use because it involves only a simple "history" of the asset's returns and assumes that past behavior will recur in the future. It does not need any assumption about the probability distribution of future returns. Once the time series of returns is constructed, the VaR is identified as the left-tail percentile corresponding to the confidence level used for the calculation. This confidence level ensures that x% of the realizations of the random variable "return" remain above the VaR, while only in the remaining (100 − x)% of cases will this random variable result in a worse outcome.
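A minimal sketch of the historical simulation method described above; the return series is synthetic, for illustration only:

```python
import random

def historical_var(returns, confidence=0.95):
    """Historical-simulation VaR: the loss at the confidence-quantile
    of the empirical loss distribution (no distributional assumption)."""
    losses = sorted(-r for r in returns)   # convert returns to losses, ascending
    k = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[k]                       # confidence% of losses lie below this

# Synthetic "history": ~one trading year of daily returns (illustrative only).
random.seed(0)
past_returns = [random.gauss(0.0, 0.01) for _ in range(250)]
print(f"95% 1-day VaR: {historical_var(past_returns):.4f}")
```

For a 1% daily volatility the 95% VaR comes out near 0.016 (i.e. 1.6% of the position value), consistent with the normal quantile 1.645 × 1%.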

b) Normal Method

It is based on the assumption that all the underlying market factors of the model follow a normal distribution. As a result, the probability distribution of the gains and losses arising from holding the financial asset is a linear combination of the distributions of the underlying factors, and the statistical properties of multivariate normal distributions can be used in the calculation of VaR. The normal method rests on the key concept of the standard deviation of the returns of the financial asset, which in turn depends on the individual standard deviations of the input factors of the model and on the correlations existing between them. The estimation of the correlations between the individual input factors can in some cases be difficult, due to the lack of a liquid market for one or more factors, or because of the lack or poor quality of the historical data used for the estimate. In these cases one can resort to using a single parameter instead of n, one which reasonably approximates all the elements of variability originally considered (for example, instead of considering n equity securities able to influence the performance of the position, one considers only the historical average return of the stock market in which all n shares are traded). The reduction of the n factors of the model to k (< n) factors is also known as risk mapping. The limit of this method is the hypothesis of normality for all parameters underlying the model, an assumption that almost never fits the real situation under consideration.
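The linear combination of normal factors can be sketched for a two-asset portfolio; the weights, volatilities and correlation below are illustrative assumptions, not data from the text:

```python
from math import sqrt
from statistics import NormalDist

def normal_var(value, weights, vols, corr, confidence=0.99):
    """Variance-covariance (delta-normal) VaR: z-quantile times the
    portfolio volatility implied by the factor vols and correlations."""
    n = len(weights)
    variance = sum(weights[i] * weights[j] * vols[i] * vols[j] * corr[i][j]
                   for i in range(n) for j in range(n))
    z = NormalDist().inv_cdf(confidence)
    return value * z * sqrt(variance)

# Hypothetical two-factor portfolio: 60/40 weights, 2%/1% daily vols, 0.3 correlation.
corr = [[1.0, 0.3],
        [0.3, 1.0]]
var = normal_var(1_000_000, [0.6, 0.4], [0.02, 0.01], corr)
print(f"99% 1-day VaR: {var:,.0f}")
```

The double sum is the quadratic form wᵀΣw; with more factors the same formula applies, which is why correlation estimation (and risk mapping to fewer factors) matters so much here.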

c) Method of Monte Carlo simulation

It is based on a concept very similar to that of historical simulation, in that it starts from historical data to define the probability distribution most suitable for describing the past behavior of the returns of the position. The first step is to identify the input factors that most sensitively affect the performance of the financial asset; for each of these parameters a hypothesis about its probability distribution is then formulated. Subsequently, these parameters are related to each other through a mathematical model that defines their relationship with the return of the financial asset: the input parameters are the independent variables of the model, and the return of the financial asset the dependent variable. Having chosen the distribution that best describes the historical returns, a pseudo-random number generator is used to produce hundreds or thousands of possible scenarios for the evolution of the factors underlying the model, which in turn determine a random distribution of the returns of the position. Finally, the VaR is calculated from the random distribution thus generated. The main limitation of this method is that choosing the distributions to use, and estimating the parameters of such a model, is not always easy. In addition, the calculation of VaR for positions dependent on many factors may require long processing times. The need to measure the risk of financial investments has led over time to the creation of different Value at Risk (VaR) models that aim to calculate the maximum potential loss of the positions taken in the market.
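A minimal single-factor sketch of the Monte Carlo steps just described, assuming (purely for illustration) a normal daily-return model; the function name and parameters are hypothetical:

```python
import random

def monte_carlo_var(value, mu, sigma, confidence=0.95, n_scenarios=100_000, seed=42):
    """Monte Carlo VaR: simulate P&L scenarios from the assumed return
    distribution, then read VaR off the simulated loss quantile."""
    rng = random.Random(seed)
    # Step 1-3 of the text collapsed to one factor: draw returns, map to P&L.
    pnl = sorted(value * rng.gauss(mu, sigma) for _ in range(n_scenarios))
    # Step 4: VaR is the loss at the (1 - confidence) quantile of simulated P&L.
    return -pnl[int((1 - confidence) * n_scenarios)]

# Hypothetical position: 1M at 0.05% daily drift and 1% daily volatility.
var = monte_carlo_var(1_000_000, mu=0.0005, sigma=0.01)
print(f"95% 1-day VaR: {var:,.0f}")
```

With a single normal factor this should land near the analytic value 1,000,000 × (1.645 × 0.01 − 0.0005) ≈ 16,000; the method's real value appears when the scenario generator involves many factors or non-normal distributions.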

The VaR is in particular a probabilistic measure that, based on the time horizon (N days) and the confidence level (X), returns the amount of invested capital remaining in case a possible negative event occurs. Other variables are the reference period (usually one day, but configurable for any maturity) and, above all, risk and volatility. As for risk, the literature identifies at least six well-known types. Delta risk corresponds to the sensitivity with respect to changes in the price of an asset, while Gamma risk indicates the second-order sensitivity to changes in the asset's price (or, if you prefer, the rate of change of the Delta risk). There is then volatility risk, also called Vega risk, connected precisely to variations in the volatility of an asset, and Theta risk, also called time-decay risk, associated with the passage of time. To these must be added correlation risk (basis risk, consisting of the sensitivity to price changes of a hedging instrument) and Rho risk, which indicates the sensitivity to changes in the discount factor and is in fact also called discount-rate risk. Another crucial factor to consider in the evaluation of the VaR is volatility, which indicates the percentage change in prices over time and is generally defined by calculating the variance of returns, i.e. the mean square difference between each return and its average (s²). If we denote by W_0 the initial investment and by W*_{α,t} the value of the investment in the worst case, the VaR is the maximum loss, equal to:

VaR = W_0 − W*_{α,t}

Advantages of using VaR methodology in the NYSE

Among the advantages of the VaR is its applicability to all types of investment, from equities to bonds, derivatives and currencies. An example may bring some clarity. If the portfolio, i.e. the set of initial investments, has a value of 100 million euros, and with confidence level X = 90% the maximum acceptable loss with a 5-day VaR is 5 million, this means that the investor has a 10% chance that after a week the portfolio may record a loss of 5 million euros, leaving a value of 95 million euros. Obviously there are more refined formulations that take into account the expected value of the portfolio at maturity and therefore include yields or coupons accruing within that time. To measure the risks, they are first identified and translated into elementary cash flows (related to individual assets), which are then discounted; this transposition must however take into account the type of asset. The easiest and fastest way to calculate the VaR is the so-called variance-covariance or parametric method, popularized by JP Morgan in the nineties. This method assumes that the returns (profits or losses) have a normal distribution. This implies that, if potential gains and losses are placed on the x-axis of a graph and their respective frequencies (i.e. the number of times those losses or gains have occurred) on the y-axis, one obtains the curve known as the Gaussian bell. It is a curve resulting from the formula.

This indicator has been used since the 1990s to measure the risk of failure of banks and other financial institutions; to facilitate application, algorithms and software for its calculation have been created (a famous one being RiskMetrics, from the investment bank JP Morgan). It is frequently the case that one deals with the distribution of the rate of return R of a portfolio over the period, rather than with the gain X associated with it. In this case, denoting by W the initial value of the portfolio, the V@R is obtained from the calculation −W·F_R^{-1}(p), i.e. by multiplying the percentage V@R by W.

The conditional value at risk

VaR has two non-negligible flaws: it is not a coherent risk measure (➔ risk measures), and it requires a very accurate (and far from simple) assessment of the left tail of the distribution. To overcome at least the first of the two, it has been proposed to replace it with the Tail V@R (or conditional V@R), formally defined as −E(X | X < F_X^{-1}(p)): the opposite of the expected value of the gain conditional on the occurrence of a loss greater than the V@R, that is, conditional on a result in the left tail (defined by the probability p) of the distribution.
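Working on losses rather than gains, the Tail V@R is simply the average of the losses at or beyond the VaR quantile. A minimal empirical sketch on toy data (the loss list is illustrative only):

```python
def var_and_cvar(losses, confidence=0.95):
    """Empirical VaR and conditional VaR (expected shortfall) of a loss sample."""
    ordered = sorted(losses)
    k = int(confidence * len(ordered))
    var = ordered[k]                  # loss at the confidence-quantile
    tail = ordered[k:]                # losses at or beyond the VaR
    cvar = sum(tail) / len(tail)      # conditional expectation over the tail
    return var, cvar

losses = list(range(1, 101))          # toy losses 1..100
var, cvar = var_and_cvar(losses)
print(var, cvar)                      # 96 98.0
```

On these toy losses the 95% VaR is 96 while the conditional VaR averages the tail {96, …, 100} to 98.0, illustrating why the conditional measure is always at least as large as the VaR.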

Calculating the Risk-Return Relationship of Investments in the NYSE

In the evaluation of a financial investment, the calculation of the relationship between risk and return is of paramount importance. For this there are specific indicators able to provide a quantitative measure, useful not only for judging the quality of a single investment but also for making comparisons between different instruments and solutions. The indicators used in this regard are the VaR (Value at Risk), the Sharpe Ratio, the TEV (Tracking Error Volatility) and the IR (Information Ratio). Each of them focuses on a particular aspect of the risk-return profile, responding to the different needs of financial analysis.

Value at Risk (VaR) is a statistical indicator of market risk that summarizes risk through a probability distribution of potential profits and losses. In practice, it is a measure of the maximum loss that a financial portfolio could incur, with a certain probability, over a given time horizon. This parameter depends on factors such as the time period considered, the confidence level (usually 95% or 99%, but definable at one's discretion) and the currency in which it is denominated. To understand it better, let us take an example. Suppose we hold a portfolio whose market value we know at the beginning of the day, and consider a 1-day VaR of €1,000 with a confidence level of 95%. This means that at the end of the day we can expect, with a probability of 95%, a maximum loss not exceeding €1,000, as long as market conditions are normal. Conversely, we can expect the value to fall by more than that amount with a probability of 5%. The popularity of Value at Risk is related to its usefulness in uniting into a single indicator different components of market risk: the VaR analysis is in fact made on the basis of the different risks to which a portfolio may be exposed, such as equity risk or the risks related to interest rates and currency exchange.

Pros and Cons of the VAR Methods

An economist's plain-language explanation of VaR reads as follows: "an estimate, expressed in monetary units, of the loss that will not be exceeded over a given period of time with a given probability." In essence, VaR is the amount of losses on the investment portfolio over a fixed period of time, provided no unfavorable event happens. "Unfavorable events" can be understood as various crises and weakly predictable factors (changes in the law, natural disasters) which may affect the market. The time horizon is typically chosen as one, five or ten days, because predicting market behavior over a longer period is extremely difficult; the level of acceptable risk (in essence, the confidence level) is taken to be 95% or 99%. Also, of course, the currency in which the loss is measured is fixed. In calculating the value it is assumed that the market behaves in a "normal" way. Let us consider the most commonly used methods for calculating VaR, as well as their advantages and disadvantages.

Historical Simulation

In a historical simulation, we use already-known past values of the portfolio's fluctuations; for example, we have the behavior of the portfolio over the previous 200 days, on the basis of which we decide to calculate the VaR. We assume that on the next day the financial portfolio will behave as it did on one of those previous days, obtaining 200 possible outcomes for the next day. Further, assuming the random variable is normally distributed, we recognize that the VaR is a percentile of that normal distribution: depending on the level of acceptable risk chosen, we select the appropriate percentile and obtain the value of interest.

The disadvantage of this method is the impossibility of constructing predictions for portfolios about which we have no information. It can also be a problem if the composition of the portfolio changes significantly within a short period of time.

Principal component method

For each financial portfolio one can compute a set of characteristics that help to evaluate the potential of its assets. These characteristics are called principal components, and are usually a set of partial derivatives of the portfolio price. To calculate the value of the portfolio, the Black-Scholes model is commonly used, which represents the dependence of the valuation of a European call option on time and on the current value of the underlying. Based on the behavior of the model, we can evaluate the potential of the option by analyzing the function with the classical methods of mathematical analysis (intervals of convexity and concavity, of increase and decrease, etc.). Based on this analysis, a VaR is calculated for each of the components, and the resulting value is built as a combination (usually a weighted sum) of the individual estimates.
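One such partial derivative is the option's delta, which gives a first-order (sensitivity-based) VaR approximation of the kind described above. A minimal sketch, using the Black-Scholes delta of a European call; all input values are illustrative assumptions:

```python
from math import log, sqrt
from statistics import NormalDist

def bs_call_delta(spot, strike, rate, sigma, maturity):
    """Black-Scholes delta of a European call: dPrice/dSpot = N(d1)."""
    d1 = (log(spot / strike) + (rate + 0.5 * sigma ** 2) * maturity) \
         / (sigma * sqrt(maturity))
    return NormalDist().cdf(d1)

# Hypothetical at-the-money call: spot 100, strike 100, 2% rate,
# 20% annual vol, 6 months to maturity; 1% daily vol for the underlying.
spot, sigma_daily = 100.0, 0.01
delta = bs_call_delta(spot, strike=100.0, rate=0.02, sigma=0.2, maturity=0.5)

underlying_var = 1.645 * sigma_daily * spot   # 95% 1-day VaR of one share
option_var = delta * underlying_var           # first-order (delta) approximation
print(f"delta={delta:.2f}, option VaR={option_var:.3f}")
```

This is only the Delta term; a fuller sensitivity-based VaR would add the Gamma (second-order), Vega and Theta contributions listed earlier, combined as a weighted sum over the components.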

Monte Carlo method

The Monte Carlo method is largely similar to the method of historical simulation; the difference is that the calculation is based not on real data but on randomly generated values. The advantage of this method is the possibility of considering a large number of situations and of emulating market behavior in extreme conditions. A clear disadvantage is the large amount of computational resources required to implement such an approach. When working with this technique, NoSQL storage and distributed computing based on MapReduce are often used; Hadoop, for example, has been applied to VaR calculation. Naturally, these are not the only methods of calculating VaR: there are both simple linear and quadratic price-prediction models, and the rather more complicated variance-covariance method, descriptions of which can be found in the literature.

Criticism of the VaR Methods

It is important to note that the calculation of VaR accepts the hypothesis of normal market behavior; however, if this assumption were true, crises would occur once every seven thousand years, which, as we see, is absolutely not the case. Nassim Taleb, the well-known trader and mathematician, subjects the current system of risk assessment to harsh criticism in his books "Fooled by Randomness" and "The Black Swan", and proposes as a solution the use of another system of risk calculation based on the lognormal distribution. Despite the criticism, VaR is used quite successfully in all major financial institutions. It is noteworthy that this approach is not always applicable, for which reason other techniques have been developed with a similar idea but a different calculation method (e.g., SVA). Given the criticism, modifications of VaR have been developed, based either on other distributions or on other methods of calculation around the peak of the Gaussian curve.
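One standard way to confront this criticism empirically is to backtest a VaR model: if its assumptions hold, daily losses should exceed the 95% VaR on roughly 5% of days. A minimal sketch on a synthetic P&L series (all figures are illustrative assumptions):

```python
import random

# Synthetic daily P&L whose std is chosen so that the 95% normal VaR is 2.0
# (in millions); under the model, ~5% of days should breach that limit.
random.seed(1)
var_limit = 2.0
daily_pnl = [random.gauss(0.0, var_limit / 1.645) for _ in range(250)]

exceedances = sum(1 for pnl in daily_pnl if -pnl > var_limit)
print(f"{exceedances} exceedances in 250 days ({exceedances / 250:.1%})")
```

On real, fat-tailed data the exceedance count is typically well above the nominal 5%, which is precisely the empirical form of Taleb's objection.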

## ANALYSIS AND FINDINGS

Computation of VaR using the Monte Carlo method

Companies can use VaR values to generate reports for managers, shareholders and external investors, since VaR aggregates various market risks into a single number with a monetary value. VaR estimates can be used for capital allocation, for setting position limits, and for evaluating the company's performance; some banks evaluate their traders' operations, and calculate their remuneration, on the basis of return per unit of VaR. Non-financial corporations can use VaR to evaluate the riskiness of cash flows and to make hedging decisions (protecting capital against adverse price movements); one interpretation of VaR is the amount of uninsured risk a corporation assumes. Among the first non-financial companies to apply VaR to market-risk assessment were the U.S. company Mobil Oil, the German companies Veba and Siemens, and the Norwegian Statoil. Investment analysts use VaR estimates for various projects, and institutional investors such as pension funds use VaR to calculate market risks: as noted in a study by the New York University Stern School of Business, about 60% of U.S. pension funds use the VaR methodology in their work.

Formally, given a time interval [t, T], where t is the current time, and a confidence level α, VaR is the loss over [t, T] that will be exceeded with probability 1 - α. Here is a simple example:

Suppose the daily VaR for a portfolio is $2 million at the 95% confidence level. This means that, in the absence of abrupt changes in market conditions, a one-day loss will exceed $2,000,000 in only 5% of cases (about once a month, if we assume 20 working days per month). Mathematically, VaR is defined as the upper limit of a one-sided confidence interval:

Probability(R(T) < -VaR) = 1 - α,

where α is the confidence level and R(T) is the growth rate of the portfolio's capital on the interval [t, T] under continuous compounding:

R(T) = log(V(t + T) / V(t)),

where V(t + T) and V(t) are the capital values of the portfolio at times t + T and t, respectively. In other words, V(t + T) = V(t) · exp(R(T)).

Note that R(T) is a random variable and is therefore characterised by some probability distribution. The value of VaR is determined from the distribution of portfolio increments as follows:

1 - α = F(-VaR) = ∫_{-∞}^{-VaR} f(x) dx,

where F(x) = Probability(R ≤ x) is the distribution function of the portfolio growth rate and f(x) is the density of the distribution of R(T).

If changes in portfolio capital are characterised by a parametric distribution, VaR can be calculated from the parameters of that distribution. We illustrate the calculation of parametric VaR with a portfolio consisting of a single asset, assuming that the distribution of the asset's return is normal with parameters μ (mean) and σ (standard deviation). The problem of calculating VaR then reduces to finding the (1 - α) quantile z_α of the standard normal distribution:

1 - α = ∫_{-∞}^{z_α} φ(z) dz = N(z_α),    x_α = μ + z_α σ,

where φ(z) is the density of the standard normal distribution, N(z) is its distribution function, and g(x) is the density of the normal distribution with mean μ and standard deviation σ.

The figure below shows the density of the normal distribution and marks the quantile z_α; the area under the density to the left of z_α (the area of the "left tail") equals 1 - α. It is often assumed that the asset's mean growth rate is μ = 0. Then

VaR = -V z_α σ,

where V is the capital value of the portfolio at the current time t.

For Example 1, a £1,000,000 investment in the FTSE-100 index with estimated monthly mean return 0.0076 and standard deviation 0.0458, the 1-month 95% VaR is

VaR = £1,000,000 × (0.0076 − 1.65 × 0.0458) ≈ £68,012 (a loss)
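
The parametric calculation above can be reproduced in a few lines of Python. Note that this sketch uses the exact 5% quantile of the standard normal (about −1.645) rather than the rounded 1.65, so the result differs slightly from the £68,012 quoted in the text.

```python
from statistics import NormalDist

def normal_var(value, mu, sigma, alpha=0.95):
    """Parametric (delta-normal) VaR of a single normally distributed
    position: VaR = -V * (mu + z_alpha * sigma), reported as a
    positive loss amount."""
    z = NormalDist().inv_cdf(1 - alpha)  # z_0.05 ≈ -1.645
    return -value * (mu + z * sigma)

# Example 1 from the text: £1,000,000 in the FTSE-100 index,
# monthly mean 0.76%, monthly standard deviation 4.58%.
var_gbp = normal_var(1_000_000, mu=0.0076, sigma=0.0458)  # ≈ £67,700
```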

We now consider the same FTSE-100 portfolio [2], but from the point of view of an investor whose functional currency is the U.S. dollar. The portfolio now effectively consists of two "assets": the stock index, denominated in pounds sterling, and the $/£ exchange rate.

Let the current value of the exchange rate be 1.629 $/£. The capital of the portfolio in U.S. dollars is then 1,000,000 / 1.629 = $613,874, and the 1-month VaR of the stock index at the 95% confidence level is:

VaR₁ = $613,874 × (0.0076 − 1.65 × 0.045) = $40,915

Estimates of the standard deviation and the mean of the $/£ exchange rate over the interval 01/88–01/95 are 0.0368 and −0.001 respectively. Thus the 1-month VaR of the $/£ exchange rate is:

VaR₂ = $613,874 × (−0.001 − 1.65 × 0.0368) = $37,888

We can now calculate the total portfolio VaR, using the fact that the variance of a portfolio of two assets with a joint normal distribution equals the sum of the variances of the assets plus twice the correlation between the assets multiplied by their standard deviations:

(VaR_p)² = (VaR₁)² + (VaR₂)² + 2 ρ VaR₁ VaR₂,

where ρ is the coefficient of correlation between the growth rates of the FTSE-100 index and the $/£ exchange rate. Its estimate is −0.2136, i.e. the FTSE-100 index and the $/£ rate are inversely correlated. Thus the 1-month portfolio VaR at the 95% confidence level is

VaR_p = √((40,915)² + (37,888)² + 2 × (−0.2136) × 40,915 × 37,888) = $49,470.

Thus we can expect that losses on the portfolio will exceed about 8% of the initial capital in 5 of 100 future months. Note that the portfolio VaR turned out smaller than the sum of the VaRs of the index and the exchange rate (equal to $78,803). This is the effect of diversification: since the assets are negatively correlated, losses on one asset are offset by gains on the other.

Also, as expected, the VaR for the American investor in the FTSE-100 index is greater than the VaR for the British investor (£68,012, equivalent to $41,751 at the exchange rate above) investing the same capital in the same index. This is due to the additional risk carried by the $/£ exchange rate.
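
The two-asset aggregation above is easy to script. The sketch below takes the component VaRs and correlation quoted in the text and reproduces both the diversified portfolio VaR and the undiversified sum.

```python
from math import sqrt

def portfolio_var(var1, var2, rho):
    """VaR of two jointly normal positions:
    VaR_p^2 = VaR_1^2 + VaR_2^2 + 2 * rho * VaR_1 * VaR_2."""
    return sqrt(var1 ** 2 + var2 ** 2 + 2 * rho * var1 * var2)

var_index = 40_915   # 1-month VaR of the FTSE-100 leg, USD (from the text)
var_fx = 37_888      # 1-month VaR of the $/£ exchange-rate leg, USD
rho = -0.2136        # estimated correlation between the two risk factors

var_total = portfolio_var(var_index, var_fx, rho)  # ≈ $49,470
undiversified = var_index + var_fx                 # $78,803
```

Because ρ is negative, the diversified figure is well below the simple sum of the legs, which is exactly the diversification effect discussed above.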

In the above examples the normal distribution was chosen for illustration because of the simplicity of the calculations. In practice, asset price increments have heavier "tails" than the normal law, i.e. in reality there are more "extreme" events than a normal distribution would predict. VaR by its nature deals precisely with predicting events from the "tails" of the distribution [3] (the "catastrophic risk" events well known in the insurance and reinsurance business). It is therefore more realistic to use a Pareto-type distribution [4], for which the probability of large deviations is given by the following expression:

Probability(R > x) = 1 − F(x) ≈ a x^(−β) as x → ∞.

Here a > 0 is a constant and β > 0 is the so-called tail index [5].
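
Under such a Pareto-type tail the quantile can be read off in closed form: if the loss L satisfies P(L > x) ≈ a x^(−β) for large x, then setting this tail probability equal to 1 − α gives VaR = (a / (1 − α))^(1/β). The values of a and β below are illustrative, not fitted to any data.

```python
def pareto_tail_var(a, beta, alpha):
    """VaR under a Pareto-type tail: solving a * VaR**(-beta) = 1 - alpha
    for the loss quantile. a and beta are illustrative constants."""
    return (a / (1 - alpha)) ** (1 / beta)

# A fatter tail (smaller beta) pushes VaR out at high confidence levels:
var_thin = pareto_tail_var(a=0.01, beta=4, alpha=0.999)  # ≈ 1.78
var_fat = pareto_tail_var(a=0.01, beta=2, alpha=0.999)   # ≈ 3.16
```

This is why tail-index estimates matter so much for "catastrophic" quantiles: halving β roughly squares the distance of the extreme quantile.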

The modelling method based on historical data constructs the distribution of portfolio changes R(T) from historical observations. The only hypothesis made about the distribution of portfolio returns is that the "future" will behave like the "past". For Example 1, analysed above, the 5% quantile of the historical FTSE-100 increments is −6.87% (marked by a vertical line in the histogram). Using historical data, we thus obtain the following VaR estimate for the FTSE-100 portfolio:

VaR = £1,000,000 × 6.87% = £68,700

(compare with VaR = £68,012 from Example 1).
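
A minimal historical-simulation sketch follows. The toy return series is constructed so that its empirical 5% quantile equals the −6.87% figure from the text; in practice the series would be the actual history of portfolio returns.

```python
def historical_var(value, returns, alpha=0.95):
    """Historical-simulation VaR: the empirical (1 - alpha) quantile
    of observed returns, scaled by portfolio value and reported as a
    positive loss amount."""
    ordered = sorted(returns)
    k = int((1 - alpha) * len(ordered))  # index of the (1 - alpha) quantile
    return -value * ordered[k]

# Toy series whose 5% quantile is -6.87%, matching the FTSE-100 example:
sample = [-0.0687] * 6 + [0.01] * 94
var_hist = historical_var(1_000_000, sample)  # £68,700
```

Note that no distributional assumption is made here beyond "the future behaves like the past"; the quantile is read straight off the sorted sample.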

The Monte Carlo method consists of specifying statistical models for the portfolio and simulating them by generating random trajectories. The VaR value is then calculated from the resulting distribution of portfolio capital growth rates, analogous to the histogram shown for the FTSE-100 index, except that the distribution is produced by artificial simulation.

VaR using the historical method and the normal method

Finding the VaR of a portfolio by the historical simulation method, given a sample of sufficient depth, is straightforward. For example, take prices over the last five years for Russian equities and for commodity futures traded on NYMEX. Portfolio No. 1 is formed from nine different assets (three Russian shares and six commodity futures), taken in equal parts at the prices available at the first moment of the dataset; Portfolio No. 2 includes seven futures contracts. At each moment we then calculate the value of the portfolio, which can be either fully "passive" or "actively diversified"; by the latter we mean a portfolio redistributed so that the assets are always held in equal shares. From the resulting value series, VaR is calculated in the usual way.
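
The "actively diversified" value series can be sketched as follows. The code rebalances to equal weights at every step; the price history is synthetic random-walk data standing in for the nine-asset dataset described in the text, so the numbers are illustrative only.

```python
import random

def rebalanced_values(prices, capital=1.0):
    """Value series of an 'actively diversified' portfolio rebalanced
    to equal weights at every step. `prices` is a list of per-period
    price vectors, one entry per asset."""
    values = [capital]
    for prev, cur in zip(prices, prices[1:]):
        # equal capital in each asset, each grown by that asset's return
        growth = sum(c / p for p, c in zip(prev, cur)) / len(prev)
        values.append(values[-1] * growth)
    return values

# Synthetic stand-in for the 9-asset, multi-year price history:
rng = random.Random(0)
prices = [[100.0] * 9]
for _ in range(250):
    prices.append([p * (1 + rng.gauss(0.0005, 0.02)) for p in prices[-1]])

values = rebalanced_values(prices)
returns = [b / a - 1 for a, b in zip(values, values[1:])]
# VaR is then computed from `returns` in the usual (historical) way.
```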

Here mx is the expected portfolio increment (the yield); VaR is the risk; px is the averaging period for mx; and f is a functional chosen according to the investor's preferences (for example, maximum annual yield with minimal daily risk, or maximum monthly yield with minimal weekly risk). But what if we are dealing not with a passive buy-and-hold portfolio, but with a portfolio whose composition changes constantly and whose open positions are not limited to "long"? If the portfolio is expected to be managed non-systematically, the only way to calculate the risk is to form a number of random portfolios at the first step and to rearrange each portfolio at random times. Averaging the VaR over all such portfolios gives the average risk of using this toolkit. The "randomness" parameters are limited only by the power of the computer carrying out the calculations; but when operating with tens of thousands of portfolios the testing speed drops significantly, and if we also want to find a portfolio composition that is optimal by yield/risk, the task becomes impossible, because the notion of the "yield of a random portfolio" is meaningless.
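
The random-portfolio averaging step can be sketched in a few lines. Weight vectors are drawn uniformly and normalised, the historical VaR of each random portfolio is computed, and the results are averaged; the return data are synthetic, and the whole construction is an illustration of the idea rather than the dissertation's actual procedure.

```python
import random

def average_random_var(returns_by_asset, n_portfolios=200, alpha=0.95, seed=1):
    """Average historical VaR over randomly weighted portfolios, a rough
    way to price the risk of an unpredictably managed book. Weights are
    uniform draws, normalised to sum to one (illustrative choice)."""
    rng = random.Random(seed)
    n_assets = len(returns_by_asset)
    n_periods = len(returns_by_asset[0])
    results = []
    for _ in range(n_portfolios):
        w = [rng.random() for _ in range(n_assets)]
        s = sum(w)
        w = [x / s for x in w]
        port = sorted(
            sum(wi * returns_by_asset[i][t] for i, wi in enumerate(w))
            for t in range(n_periods)
        )
        results.append(-port[int((1 - alpha) * n_periods)])
    return sum(results) / len(results)

# Synthetic return histories for five assets:
rng = random.Random(2)
rets = [[rng.gauss(0, 0.02) for _ in range(500)] for _ in range(5)]
avg_var = average_random_var(rets)
```

As the text notes, the cost grows linearly in the number of random portfolios, which is what makes exhaustive yield/risk optimisation over them impractical.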

Observations and analysis of the VaR methods

The method of scenario analysis examines the effect on portfolio capital of changes in the values of risk factors (e.g. interest rates, volatility) or of model parameters, with the modelling carried out according to specified "scenarios". Many banks, for instance, assess the "PV01" of their fixed-income portfolios (portfolios of interest-rate instruments: bonds, interest-rate forwards, swaps, etc.), calculated as the change in portfolio capital under a parallel shift of the yield curve by one basis point; larger stress scenarios, such as a 100-basis-point shift, are analysed in the same way.
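
A parallel-shift scenario is easy to illustrate on a single bond. The sketch below prices an annual-coupon bond at a flat yield (a simplifying assumption; real desks use a full curve) and revalues it after a 100-basis-point upward shift.

```python
def bond_price(face, coupon_rate, years, y):
    """Price of an annual-coupon bond at a flat yield y."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + y) ** t for t in range(1, years + 1))
    return pv_coupons + face / (1 + y) ** years

# Scenario: parallel shift of the (flat) curve by +100 basis points.
base = bond_price(face=1000, coupon_rate=0.05, years=10, y=0.05)
shifted = bond_price(face=1000, coupon_rate=0.05, years=10, y=0.06)
scenario_pnl = shifted - base  # loss under the +100bp scenario
```

The bond trades at par in the base case (coupon equals yield), and the scenario produces a loss of roughly 7% of face value; a PV01 figure is the same calculation with a one-basis-point shift.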

Unfortunately, there appears to be no definitive answer to the question of which VaR estimation method is best. The choice of method should be based on factors such as the quality of the database, the ease of implementing the method, the availability of high-speed computers, the required reliability of the results, and so on.

In conclusion, it should be noted that the VaR methodology is not a panacea against financial losses. It only helps companies to judge whether the risks to which they are exposed are the risks they would like to take, or think they have taken. VaR cannot tell a manager "how much risk to take"; it can only say "how much risk has been taken". And VaR should be used not instead of, but in addition to, other methods of risk analysis, such as Shortfall-at-Risk (SAR, the average amount of losses), which is concerned not only with the capital threshold below which a loss is to be expected with a certain probability, but also with the size of that loss.
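
The SAR-style complement mentioned above can be sketched alongside historical VaR: once the empirical quantile is located, the average of the outcomes at least as bad as it gives the expected size of the loss in the tail. The four-loss return sample is a toy series chosen so the numbers are easy to verify.

```python
def var_and_shortfall(returns, alpha=0.95):
    """Historical VaR and average shortfall (mean loss at least as bad
    as the VaR outcome), both reported as positive loss fractions."""
    ordered = sorted(returns)
    k = int((1 - alpha) * len(ordered))
    var = -ordered[k]                # empirical (1 - alpha) quantile
    tail = ordered[:k + 1]           # outcomes at least as bad as VaR
    shortfall = -sum(tail) / len(tail)
    return var, shortfall

# Toy series: 20 observations, two of them losses.
rets = [-0.10, -0.04] + [0.01] * 18
var, sar = var_and_shortfall(rets)   # var = 0.04, sar = 0.07
```

Here VaR says losses beyond 4% occur with about 5% probability, while the shortfall says that when they do occur they average 7%, which is exactly the extra information SAR contributes.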

## CONCLUSIONS

At the same time, VaR is a sufficiently effective tool for optimising a set of trading systems and their shares in a portfolio. Management is conducted according to a clearly formulated set of rules, so for a specific portfolio at the initial time there is a yield curve that can be analysed going forward. Running through a number of different portfolios, we obtain different trajectories of the management process (many yield curves of the mechanical trading system); we can then calculate the VaR of each trajectory and use it as needed, for example in computing the theoretical drawdown of different portfolios, or in finding VaR-optimal portfolios within the trading strategy. In the same way we can diversify not only across assets but also across trading strategies. Following the principle "do not put all your eggs in one basket", an experienced designer of automated trading systems usually keeps several different strategies in the arsenal, and VaR optimisation helps to allocate funds among them properly in terms of risk. Knowing the risks and the expected return, one can "play" with some leverage to increase the profitability of the system.

A relatively simple implementation of all these procedures (VaR calculation, portfolio management and trading strategies) is available by integrating Excel VBA with the most popular trading platforms via DDE or API interfaces. One option is as follows: each trading strategy is programmed in a separate VBA module, and a main control module communicates with the trading platform (receiving quotes, and placing and removing orders). The main module sends real-time quotes to the strategy modules while collecting their trade signals, and is responsible for money management, i.e. it distributes funds between strategies according to the rules laid down.

At certain intervals it is necessary to re-optimise the strategies and the assets comprising the portfolio. With a proper implementation of these principles, the presence of a human at the computer conducting the trading becomes unnecessary; but the roles of the analyst, who develops the principles of the trading strategies, and of the programmer, who implements the mechanical trading system, grow significantly.

When dealing with illiquid securities there are two options. The first is to take liquidity risk into account in the VaR calculation, in the simplest case by increasing the per-instrument VaR by the spread. The second is to use "clever" order placement, i.e. not at the market price, as is usually implemented in mechanical trading systems, but with the aim of a limited practical approximation to the theoretical VaR. This work has not applied the Shortfall risk measure (the expected loss, given that the loss exceeds the VaR for the instrument and time horizon), deservedly regarded as an excellent complement to VaR in situations where it is necessary to estimate the loss in an unlikely outcome. Finding a set of optimal strategies for this measure is useful mainly under aggressive money management, where the yield curve grows smoothly for months and then, when funds to maintain the chosen management strategy run out, takes a single but very strong "drawdown" of capital.

## BIBLIOGRAPHY

- Berkowitz, Jeremy, Peter Christoffersen, and Denis Pelletier. "Evaluating value-at-risk models with desk-level data." Management Science 57, no. 12 (2011): 2213-2227.

- Bessis, Joel. Risk management in banking. John Wiley & Sons, 2011.

- Brealey, Richard A. Principles of corporate finance. Tata McGraw-Hill Education, 2012.

- Chance, Don, and Roberts Brooks. Introduction to derivatives and risk management. Cengage Learning, 2012.

- Chiu, Chun-Hung, and Tsan-Ming Choi. "Optimal pricing and stocking decisions for newsvendor problem with value-at-risk consideration." Systems, Man and Cybernetics, Part A: Systems and Humans, IEEE Transactions on 40, no. 5 (2010): 1116-1119.

- Christoffersen, Peter F. Elements of financial risk management. Academic Press, 2012.

- Duffie, Darrell, and Kenneth J. Singleton. Credit risk: pricing, measurement, and management. Princeton University Press, 2012.

- Emerging Risk Factors Collaboration. "C-reactive protein, fibrinogen, and cardiovascular disease prediction." The New England journal of medicine 367, no. 14 (2012): 1310.

- Föllmer, Hans, and Alexander Schied. "Convex risk measures." Encyclopedia of Quantitative Finance (2010).

- Genevay, Muriel, Mari Mino-Kenudson, Kurt Yaeger, Ioannis T. Konstantinidis, Cristina R. Ferrone, Sarah Thayer, Carlos Fernandez-del Castillo et al. "Cytology adds value to imaging studies for risk assessment of malignancy in pancreatic mucinous cysts." Annals of surgery 254, no. 6 (2011).

- Grover, Varun, and Rajiv Kohli. "Cocreating IT Value: New Capabilities and Metrics for Multifirm Environments." MIS Quarterly 36, no. 1 (2012): 225-232.

- Hyde, Adam S., and Robert A. Wood. "The Persistence of Dominant-Firm Market Share: Raising Rivals' Cost on the New York Stock Exchange." (2014).

- Krause, Jochen, and Marc S. Paolella. "A Fast, Accurate Method for Value-at-Risk and Expected Shortfall." Econometrics 2, no. 2 (2014): 98-122.

- McNeil, Alexander J., Rüdiger Frey, and Paul Embrechts. Quantitative risk management: concepts, techniques, and tools. Princeton university press, 2010.

- McShane, Michael K., Anil Nair, and Elzotbek Rustambekov. "Does enterprise risk management increase firm value?." Journal of Accounting, Auditing & Finance 26, no. 4 (2011): 641-658.

- Michie, Ranald. The London and New York Stock Exchanges 1850-1914 (Routledge Revivals). Routledge, 2011.

- Mihaescu, Raluca, Moniek Van Zitteren, Mandy Van Hoek, Eric JG Sijbrands, André G. Uitterlinden, Jacqueline CM Witteman, Albert Hofman, MG Myriam Hunink, Cornelia M. Van Duijn, and A. Cecile JW Janssens. "Improvement of risk prediction by genomic profiling: reclassification measures versus the area under the receiver operating characteristic curve." American journal of epidemiology 172, no. 3 (2010): 353-361.

- Mitchell, Robert Cameron, and Richard T. Carson. Using surveys to value public goods: the contingent valuation method. Routledge, 2013.

- Niv, Yael, Jeffrey A. Edlund, Peter Dayan, and John P. O'Doherty. "Neural prediction errors reveal a risk-sensitive reinforcement-learning process in the human brain." The Journal of Neuroscience 32, no. 2 (2012): 551-562.

- O'Neill, Martin, and Wolfram Schultz. "Coding of reward risk by orbitofrontal neurons is mostly distinct from coding of reward value." Neuron 68, no. 4 (2010): 789-800.

- Pesaran, Bahram, and M. Hashem Pesaran. Time Series Econometrics Using Microfit 5.0: A User's Manual. Oxford University Press, Inc., 2010.

- Prahalad, Coimbatore K., and Venkat Ramaswamy. The future of competition: Co-creating unique value with customers. Harvard Business Press, 2013.

- Pratheepkanth, Puwanenthiren. "Capital structure and financial performance: Evidence from selected business companies in Colombo stock exchange Sri Lanka." Journal of Arts, Science & Commerce 2, no. 2 (2011): 171-183.

- Priori, Silvia G., Maurizio Gasparini, Carlo Napolitano, Paolo Della Bella, Andrea Ghidini Ottonelli, Biagio Sassone, Umberto Giordano et al. "Risk Stratification in Brugada SyndromeResults of the PRELUDE (PRogrammed ELectrical stimUlation preDictive valuE) Registry." Journal of the American College of Cardiology 59, no. 1 (2012): 37-45.

- Rockafellar, R. T., J. O. Royset, and S. I. Miranda. Superquantile regression with applications to buffered reliability, uncertainty quantification, and conditional value-at-risk. NAVAL POSTGRADUATE SCHOOL MONTEREY CA, 2013.

- Salcedo-sanz, Sancho, Merc Claramunt-bielsa, Jose Luis Vilar-zann, and Antonio Heras. "Statistical and Soft Computing Approaches in Insurance Problems." (2013).

- Saunders, Anthony, and Linda Allen. Credit risk management in and out of the financial crisis: New approaches to value at risk and other paradigms. Vol. 528. John Wiley & Sons, 2010.

- Schultz, Wolfram. "Review Dopamine signals for reward value and risk: basic and recent data." Behav. Brain Funct 6 (2010): 24.

- Street, Alexandre. "On the Conditional Value-at-Risk probability-dependent utility function." Theory and Decision 68, no. 1-2 (2010): 49-68.

- Wachter, Jessica A. "Can Time‐Varying Risk of Rare Disasters Explain Aggregate Stock Market Volatility?" The Journal of Finance 68, no. 3 (2013): 987-1035.