Category Archives: Uncategorized

asdoc: Export Stata dta file to MS Word

Creating tables in Stata using asdoc is super-easy. In this short post, I’ll show how to use asdoc to export Stata data to MS Word files. If you have not already installed asdoc, it can be installed by

ssc install asdoc

 

For exporting values from the data file, we can use the sub-command list of asdoc. We can also make the command conditional using the in and if qualifiers. In the following example, let us use the auto dataset from the system folders:

sysuse auto, clear
asdoc list price trunk mpg turn in 1/10 , replace

 

 

Explanation

In the above line of code, we wrote asdoc followed by the sub-command list. After that, we specified the names of the variables that we wanted to export to the MS Word document: price, trunk, mpg, and turn. The phrase in 1/10 is the in qualifier, which restricts the list to observations 1 to 10. The option replace replaces any existing output file that has the default name Myfile.doc.
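
The if qualifier works the same way when we want to export only observations that meet a condition. The following is a minimal sketch; the condition and the file name Myexport.doc are illustrative, and save() is asdoc's option for naming the output file.

sysuse auto, clear
asdoc list price trunk mpg turn if foreign == 1, save(Myexport.doc) replace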

 


ASROL Version update: calculation of geometric mean and products in a rolling window and over groups in Stata

 

Description

asrol calculates descriptive statistics in a user-defined rolling window or over a grouping variable. asrol can efficiently handle all types of data structures, such as data declared as time series or panel data, undeclared data, or data with duplicate values, missing values, or time-series gaps. asrol can be used for the calculation of a variety of statistics [see table of contents].

 

Installation

ssc install asrol

After installation, you can read the help file by typing:

help asrol

 

Options and defaults

Version 4.5.1 of asrol significantly improves the calculation of the product and the geometric mean. Since both statistics involve the multiplication of the values in a given window, the presence of missing values and zeros presents a challenge to getting the desired results. Following are the defaults in asrol for dealing with missing values and zeros.

a. Missing values are ignored when calculating the product or the geometric mean of values.

b. Handling zeros in geometric mean: To be consistent with Stata’s default for geometric mean calculations (see ameans), the default in asrol is to ignore zeros and negative numbers. So the geometric mean of 0, 2, 4, 6 is 3.6342412, that is (2 * 4 * 6)^(1/3). And the geometric mean of 0, -2, 4, 6 is 4.8989795, that is (4 * 6)^(1/2).

c. Handling zeros in products: Zeros are considered when calculating the product of values. So the product of 0,2,4,6 is 0

d. Option ignorezero: This option can be used to ignore zeros when calculating the product of values. Therefore, when the zero is ignored, the product of 0, 2, 4, 6 is 48.

e. Option add(#): This option adds a constant # to each value in the range before calculating the product or the geometric mean. Once the required statistic is calculated, the constant is subtracted back. So using option add(1), the product of 0, .2, .4, .6 is 1.6880001, that is [(1+0) * (1+.2) * (1+.4) * (1+.6)] - 1, and the geometric mean is .280434, that is [(1+0) * (1+.2) * (1+.4) * (1+.6)]^(1/4) - 1.
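
These defaults can be verified on a toy variable. The following is a minimal sketch; the variable x and the generated variable names are illustrative, and the options are those documented above.

* Toy data with the values used above
clear
input x
0
2
4
6
end
asrol x, stat(gmean) gen(gm_x)                  // 3.6342412: the zero is ignored
asrol x, stat(product) gen(pr_x)                // 0: the zero is included by default
asrol x, stat(product) gen(prig_x) ignorezero   // 48: the zero is ignored
list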

 

Examples

Let us start with simple examples of calculating the geometric mean and products. Our example data has stock prices, a company identifier (symbol), and a time identifier (date).

use http://fintechprofessor.com/stocks.dta, clear

* Generate a numeric identifier for each firm
encode symbol, gen(id)

* Declare the data as panel data
tsset id date

* Create stock returns
gen returns = d.close/l.close

* Note: the above formula for stock returns is equivalent to
gen returns2 = (close - L.close) / L.close

 

 Geometric mean

Now we find the geometric mean of stock returns, adding 1 before the calculation and subtracting it back afterwards. The calculations are made for each firm in a rolling window of 20 observations.

bys id: asrol returns, stat(gmean) window(date 20) add(1)

 

Products – the case of cumulative returns

Since we take the product of (1 + returns) to find cumulative returns over n periods, we can use the product function of asrol [read this blog entry for more details on simple and log returns].

Cumulative n-period simple returns = (1+simple_r1) * (1+simple_r2)
 * (1+simple_r3) * ... * (1+simple_rn) - 1     --- (Eq. 1)

 

The asrol command for the 20-period rolling-window cumulative returns would be:

bys id: asrol returns, stat(product) window(date 20) add(1)

 

 

Option Ignore Zeros

The option ignorezero (or ig) can be useful when we want to exclude zeros from the calculation of products. Let’s say we have a variable x that has the values 1, 2, 3, 0, and 5. Finding the product of this variable will result in zero. If there were circumstances where we wish to find the product of only the non-zero values, the asrol command would be:

asrol x, stat(product) ig
list

  +--------------+
  | x   produc~x |
  |--------------|
  | 1         30 |
  | 2         30 |
  | 3         30 |
  | 0         30 |
  | 5         30 |
  +--------------+

Without using the option ig, the product would be zero

asrol x, stat(product) gen(pro_withoutig)
list
  +-------------------------+
  | x   produc~x   pro_wi~g |
  |-------------------------|
  | 1         30          0 |
  | 2         30          0 |
  | 3         30          0 |
  | 0         30          0 |
  | 5         30          0 |
  +-------------------------+

 

A note on the methods used

Previous versions of asrol used a log transformation of the values for finding products and geometric means. From version 4.5.1 onwards, the log-transformation method is discontinued in the calculation of products and geometric means. The calculations now use actual multiplication of the values in a given range, so the geometric mean is calculated as the nth root of the product of n numbers.

For a set of numbers x1, x2, ..., xn, the geometric mean is defined as

Geometric mean = (x1 * x2 * ... * xn)^(1/n)

that is, the nth root of the product of the n numbers.

Similarly, the product is calculated as:

Product = x1 * x2 * ... * xn
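
As a quick check of this method in Stata, the display command reproduces the numbers quoted in the defaults section above (2, 4, and 6 are the non-zero values used there):

display (2 * 4 * 6)^(1/3)    // 3.6342412, the geometric mean with the zero ignored
display  2 * 4 * 6           // 48, the product with the zero ignored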

asreg: Get standard errors of the first stage regression of the Fama and MacBeth (1973) Procedure in Stata

In the following example, we shall use asreg, which can be installed from SSC by typing the following line in the Stata Command window:

ssc install asreg

 

The problem

Let’s say that we wish to report additional statistics from the Fama and MacBeth (1973) regression, such as the standard errors of the first-stage coefficients. Using the fmb option, asreg can efficiently estimate the FMB regression. Further, it reports the coefficients of the first-stage regressions when the option first is used with the option fmb. However, it does not report other first-stage regression statistics.

 

The solution

The good news is that we can still obtain the other regression components using asreg. Since the first stage of the FMB procedure is a cross-sectional regression estimated separately for each time period, we can use the bysort period prefix with asreg.

 

An example

Let us use the grunfeld data and estimate the FMB regression in the usual manner.

webuse grunfeld, clear
asreg invest mvalue kstock, fmb first

First stage Fama-McBeth regression results

  +------------------------------------------------------------+
  | _TimeVar   _obs       _R2   _b_mva~e   _b_kst~k      _Cons |
  |------------------------------------------------------------|
  |     1935     10   .865262    .102498   -.001995    .356033 |
  |     1936     10   .696394    .083707   -.053641    15.2189 |
  |     1937     10   .663763    .076514    .217722   -3.38647 |
  |     1938     10   .705577    .068018    .269115   -17.5819 |
  |     1939     10   .826602    .065522    .198665   -21.1542 |
  |     1940     10   .839255    .095399    .202291   -27.0471 |
  |     1941     10   .856215    .114764    .177465   -16.5195 |
  |     1942     10   .857307    .142825    .071024   -17.6183 |
  |     1943     10   .842064     .11861    .105412   -22.7638 |
  |     1944     10   .875515    .118164    .072207   -15.8281 |
  |     1945     10   .906797    .108471    .050221   -10.5197 |
  |     1946     10   .894752    .137948    .005413   -5.99066 |
  |     1947     10   .891239    .163927   -.003707   -3.73249 |
  |     1948     10   .788823    .178667   -.042556    8.53881 |
  |     1949     10   .863257    .161596   -.036965    5.17829 |
  |     1950     10   .857714    .176217   -.022096   -12.1747 |
  |     1951     10   .873773    .183141   -.112057    26.1382 |
  |     1952     10   .846122    .198921   -.067495    7.29284 |
  |     1953     10   .889261    .182674    .098753   -50.1525 |
  |     1954     10    .89845    .134512    .331375   -133.393 |
  |------------------------------------------------------------|
  |   Mean   1944.5    10   .836907   .130605    .072958    -14.757 |
  +------------------------------------------------------------+
Fama-MacBeth (1973) Two-Step procedure           Number of obs     =       200
                                                 Num. time periods =        20
                                                 F(  2,    19)     =    195.04
                                                 Prob > F          =    0.0000
                                                 avg. R-squared    =    0.8369
------------------------------------------------------------------------------
             |            Fama-MacBeth
      invest |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      mvalue |   .1306047   .0093422    13.98   0.000     .1110512    .1501581
      kstock |   .0729575   .0277398     2.63   0.016     .0148975    .1310176
       _cons |  -14.75697   7.287669    -2.02   0.057    -30.01024     .496295
------------------------------------------------------------------------------

 

An alternative way to obtain the first-stage results

bys year: asreg invest mvalue kstock, se
bys year: keep if _n == _N
+-------------------------------------------------------------------------------------------------------------+
| year _Nobs _R2 _adjR2 _b_mvalue _b_kstock _b_cons _se_mv~e _se_ks~k _se_cons |
|-------------------------------------------------------------------------------------------------------------|
| 1935 10 .86526202 .82676546 .10249786 -.00199479 .35603339 .0157931 .2148591 23.82794 |
| 1936 10 .69639369 .60964903 .08370736 -.05364126 15.218946 .0211982 .4125528 49.72796 |
| 1937 10 .6637627 .56769491 .0765138 .21772236 -3.3864706 .0218952 .4745161 62.14382 |
| 1938 10 .70557727 .62145649 .06801777 .26911462 -17.581903 .0220019 .2076121 33.62243 |
| 1939 10 .82660153 .77705911 .06552194 .19866456 -21.154227 .0131751 .1563955 29.10151 |
|-------------------------------------------------------------------------------------------------------------|
| 1940 10 .83925512 .79332801 .095399 .20229056 -27.047068 .0171077 .2206074 42.49812 |
| 1941 10 .85621485 .81513338 .11476375 .17746501 -16.519486 .0197202 .2338307 47.43406 |
| 1942 10 .85730699 .81653756 .14282513 .07102405 -17.618283 .0246973 .1966943 43.85369 |
| 1943 10 .84206394 .79693935 .11860951 .10541193 -22.763795 .0207092 .1887016 46.8604 |
| 1944 10 .87551498 .83994783 .11816422 .07220719 -15.828145 .0169881 .1537212 41.84578 |
|-------------------------------------------------------------------------------------------------------------|
| 1945 10 .90679731 .88016797 .1084709 .05022083 -10.519677 .0133214 .1254533 35.10524 |
| 1946 10 .89475165 .8646807 .13794817 .00541339 -5.9906571 .018637 .1600683 45.73243 |
| 1947 10 .89123943 .86016498 .16392696 -.00370721 -3.7324894 .0280743 .1285463 37.80575 |
| 1948 10 .7888235 .72848735 .1786673 -.04255555 8.5388099 .0463983 .1661775 52.39133 |
| 1949 10 .86325678 .82418728 .16159617 -.03696511 5.1782863 .0346516 .1268614 41.07802 |
|-------------------------------------------------------------------------------------------------------------|
| 1950 10 .85771384 .81706065 .17621675 -.02209565 -12.17468 .0393216 .1361792 46.6222 |
| 1951 10 .87377295 .83770808 .18314051 -.11205694 26.138157 .0358898 .1486738 53.00348 |
| 1952 10 .84612242 .80215739 .19892081 -.06749499 7.2928402 .052286 .1906835 67.84544 |
| 1953 10 .88926056 .85762072 .18267385 .09875335 -50.152546 .058579 .2164437 77.91569 |
| 1954 10 .89845005 .86943578 .13451162 .33137459 -133.39308 .0704524 .1932826 76.18067 |
+-------------------------------------------------------------------------------------------------------------+

 

Explanation

In the above lines of code, we estimated yearly cross-sectional regressions with the option se to report the standard errors. We then retained just one observation per year, deleting the duplicates. The results are the same as those reported by the option first of the fmb regression, the only difference being that we now have additional regression statistics.
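
With the coefficients and standard errors from the yearly regressions in hand, first-stage t-statistics can be computed directly. The following is a minimal sketch using the variable names that asreg created above (the new variable names are illustrative):

gen t_mvalue = _b_mvalue / _se_mvalue
gen t_kstock = _b_kstock / _se_kstock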


Quick Table for Converting Different Dates to Stata Format

Daily Dates

Copying data from the internet, CSV files, or other sources into Stata usually records the date as a string variable, shown in red in the Data Browser. Before we can use the Stata time-series or panel-data capabilities, we need to convert the string date to a Stata date. In the following table, the first column shows different formats in which the date may already be recorded and brought into Stata. To convert them into a Stata date, example code is shown in the second column. Once the date is converted into Stata's numeric format, we need to format the date so that its visual display is human-readable. We can do that by using the %td format, for example, format mydate %td

Text          Code                                      Output
15/1/2015     gen mydate = date(text, "DMY")            15jan2015
2015/1/15     gen mydate = date(text, "YMD")            15jan2015
201502        gen mydate = dofm(monthly(text, "YM"))    01feb2015
1/15/08       gen mydate = date(text, "MDY", 1999)      15jan1908
1/15/08       gen mydate = date(text, "MDY", 2019)      15jan2008
1/15/51       gen mydate = date(text, "MDY", 2000)      15jan1951
1/15/01       gen mydate = date(text, "MDY", 2050)      15jan2001
1/15/00       gen mydate = date(text, "MDY", 2050)      15jan2000
20060125      gen mydate = date(text, "YMD")            25jan2006
060125        gen mydate = date(text, "20YMD")          25jan2006


Example using some data

* Enter example data
clear
input str9 text
"15/1/2015"
end

* Now convert the variable text to Stata date
gen mydate=date(text, "DMY")

* Change the display format
format mydate %td

 

From daily to other frequencies

From daily to    Code
Weekly           gen weekly_date = wofd(daily_date)
Monthly          gen monthly_date = mofd(daily_date)
Quarterly        gen quarterly_date = qofd(daily_date)
Yearly           gen year = year(daily_date)

 

 

Example using some data

* Enter example data
clear
input str9 text
"15/1/2015"
end

* Now convert the variable text to Stata date
gen daily_date=date(text, "DMY")
format daily_date %td


* Create a weekly date
gen weekly_date = wofd(daily_date)
format weekly_date %tw

* Create a monthly date
gen monthly_date = mofd(daily_date)
format monthly_date %tm

* Create a quarterly date
gen quarterly_date = qofd(daily_date)
format quarterly_date %tq

* Create a yearly date
gen year = year(daily_date)

 

 

From other frequencies to daily

If we already have dates in weekly, monthly, or quarterly frequency, we can convert them back to daily dates. The second column in the following table provides an example of the format in which the date is already recorded, and the third column presents the code that creates a daily date. To see the code in action, download this do-file and execute it. The file extension should be changed from doc to do after download.

From        given_date    Code
Weekly      2018w46       gen daily_date = dofw(given_date)
Monthly     2018m11       gen daily_date = dofm(given_date)
Quarterly   2018q4        gen daily_date = dofq(given_date)
Yearly      2018          gen daily_date = dofy(given_date)

 

Complex Conversions

If we already have dates in weekly, monthly, or quarterly frequency, we can convert them to daily dates first and then to other frequencies. The second column in the following table provides an example of the format in which the date is already recorded, and the third column presents the code that converts the date to the desired frequency.

From                   given_date    Code
Weekly to monthly      2018w46       gen monthly_date = mofd(dofw(given_date))
Monthly to weekly      2018m11       gen weekly_date = wofd(dofm(given_date))
Quarterly to monthly   2018q4        gen monthly_date = mofd(dofq(given_date))
Monthly to quarterly   2018m11       gen quarterly_date = qofd(dofm(given_date))
Weekly to quarterly    2018w46       gen quarterly_date = qofd(dofw(given_date))
Quarterly to weekly    2018q4        gen weekly_date = wofd(dofq(given_date))
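
The following is a minimal sketch of the first conversion above (the weekly value 2018w46 is illustrative):

clear
set obs 1
gen given_date = tw(2018w46)
format given_date %tw
gen monthly_date = mofd(dofw(given_date))
format monthly_date %tm
list          // monthly_date is 2018m11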

 

 

asdoc version 2: Summary of new features | export Stata output to MS Word

Version 2.0 of asdoc is here. This version brings several improvements, adds new features, and fixes minor bugs in the earlier version. Following is the summary of new features and updates.

 

Brief Introduction of asdoc

asdoc sends Stata output to Word / RTF format. asdoc creates high-quality, publication-ready tables from various Stata commands such as summarize, correlate, pwcorr, tab1, tab2, tabulate1, tabulate2, tabstat, ttest, regress, table, ameans, proportions, means, and many more. Using asdoc is pretty easy: we just need to add asdoc as a prefix to Stata commands. asdoc also has several built-in routines for dedicated calculations and for making nicely formatted tables.

 

How to update

The program can be updated by using the following command from Stata command window

ssc install asdoc, replace

 

New Features in Version 2.0

1.  Wide regression tables

This is a new format in which regression tables can be reported. In this format, the variables are shown in columns and one regression is reported per row. Therefore, this type of regression table is ideal for regressions estimated over portfolios, industries, years, etc. Here is one example of a wide regression table. asdoc allows a significant amount of customization for wide tables, including asterisks for showing significance, reporting t-statistics and standard errors either below the regression coefficients or sideways, controlling decimal points, reporting additional regression statistics such as adjusted R2, RMSE, RSS, etc., adding multiple tables to the same file, and several other features. Read this post to know more about the wide table format.

 

2. Allowing by-group regressions

Version 2.0 of asdoc provides the convenience of estimating regressions over groups and summarizing the regression estimates in nicely formatted tables. This feature follows the Stata default of bysort prefix. This feature works with all three types of regression tables of asdoc that include detailed regression tables, nested tables, and wide tables. In this blog post, I show some examples of by-group regressions.

 

3. Allowing by-group descriptive statistics

Using the bysort prefix with asdoc, we can now find default, detailed, and customized summary statistics over groups. Details related to this feature will be added later on in a blog post.
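
The following is a minimal sketch of by-group summary statistics using the auto dataset; the variables and the grouping variable foreign are illustrative:

sysuse auto, clear
bys foreign: asdoc sum price mpg weight, replace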

 

4. Option label with tabulate and regress commands

Option label can now be used with regression and tabulation commands. With this option, asdoc reports variable labels instead of variable names. If a variable's label is empty, its name is reported instead.
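
A minimal sketch of the label option with a regression command, again using the auto dataset (the variable choice is illustrative):

sysuse auto, clear
asdoc regress price mpg weight, label replace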

 

5. Developing tables row by row using option row

Option row is a new feature in version 2. Option row allows building a table row by row from text and statistics. In each run of asdoc with option row, a row is added to the output table. This is a useful feature when statistics are collected from different Stata commands to build customized tables. To know more about this option, read this blog post.

 

6.  Accumulate text or numbers with option accum

Option accum allows accumulating text or numbers in a global macro. Once accumulated, the contents of the macro can then be written to an output file using option row.

 

7. Saving files in different folders

One additional feature of version 2.0 is the ability to write new files or append to existing files in different folders.

 


Plotting cumulative average abnormal (CAAR) on a graph in Stata

In Stata, we can use the twoway graph for plotting abnormal returns or cumulative average abnormal returns (CAAR) against the days of the event window. Suppose that we have an event window of 7 days and the following data:

 

 
days     caar1        caar2        caar3        caar4
 -3       0            0            0            0
 -2      -.0043456    -.0050911     .0000683     .0000504
 -1      -.0034961    -.0023533     .0037439     .0042783
  0      -.0034278     .0019828     .0090661     .0106628
  1       .0016178     .0067894     .0131572     .0156011
  2       .0039689     .0104367     .0190594     .0221428
  3       .0040022     .0129478     .0218878     .0267722
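
For a quick test, the data can be entered directly in Stata; the following is a minimal sketch copying the values from the table above:

clear
input days caar1 caar2 caar3 caar4
-3  0         0         0        0
-2 -.0043456 -.0050911 .0000683 .0000504
-1 -.0034961 -.0023533 .0037439 .0042783
 0 -.0034278  .0019828 .0090661 .0106628
 1  .0016178  .0067894 .0131572 .0156011
 2  .0039689  .0104367 .0190594 .0221428
 3  .0040022  .0129478 .0218878 .0267722
end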

To plot the first variable, caar1, we shall type:

graph twoway line caar1 days, xline(0)

 

If we were to plot all caars, then

graph twoway line caar* days, xline(0)

 


Rolling regressions, beta, t-statistics, and SE in Stata

asreg can easily estimate rolling regressions, betas, t-statistics, and SE in Stata. To understand the syntax and basic use of asreg, you can watch this YouTube video. In this post, I show how to use asreg for reporting standard errors, fitted values, and t-statistics in a rolling window.

To install asreg, type the following in the Stata Command window:

ssc install asreg

 

Report standard errors and fitted values 

We shall use the grunfeld dataset for our examples. Using a rolling window of 15 observations, let us fit a regression model where the dependent variable is invest and the independent variables are mvalue and kstock. We shall estimate the rolling regression separately for each company; therefore, we use the prefix bys company :

Please note that the options se and fit are used for reporting standard errors and fitted values, respectively.

webuse grunfeld, clear

bys company: asreg invest mvalue kstock, wind(year 15) fit se

 

Find t-statistics in the rolling window

Once we have the standard errors and coefficients, we can generate t-statistics by dividing each coefficient by its standard error. Therefore, to find t-values for the variables mvalue and kstock, we can generate new variables:

gen t_mvalue = _b_mvalue / _se_mvalue

gen t_kstock = _b_kstock / _se_kstock



Log vs simple returns: Examples and comparisons

Simple vs log returns | Conversion from daily to other frequencies

MS Excel Example 

[Download Example]

In the table below, we have data from 1/1/2010 to 1/7/2010. The first column has the firm id; the second column has dates; the third column has stock prices.

id   date        prices   simple ri   log_ri     ri+1
1    1/1/2010    70
1    1/2/2010    72        2.857%      2.817%    102.857%
1    1/3/2010    75        4.167%      4.082%    104.167%
1    1/4/2010    73       -2.667%     -2.703%     97.333%
1    1/5/2010    74        1.370%      1.361%    101.370%
1    1/6/2010    76        2.703%      2.667%    102.703%
1    1/7/2010    77        1.316%      1.307%    101.316%

The fourth and fifth columns have simple and log returns, calculated as:

simple ri = (Price[i] - Price[i-1]) / Price[i-1]   --- (Eq. 1)
log ri = ln( Price[i] / Price[i-1] )               --- (Eq. 2)

where Price[i] is the stock price in the current period, Price[i-1] is the stock price in the previous period, and ln is the natural log. To convert simple returns to n-period cumulative returns, we take the product of the terms (1 + ri) up to period n. Therefore, the sixth column adds a value of 1 to the simple period returns.

Weekly cumulative simple returns

Suppose we wish to find the weekly cumulative simple return from the stock prices. We just use the first and the last stock prices of the week and apply Equation (1). Therefore, our cumulative weekly simple return is as follows:

weekly simple ri = ( 77 - 70) /  70 = 10.00%

And if we were to find the weekly cumulative simple return from the daily returns, then we would add 1 to each periodic simple_ri, take the product, and deduct 1 at the end. Therefore, the formula for converting simple periodic daily returns to weekly cumulative returns would be:

Cumulative n-period simple returns =

(1+simple_r1) * (1+simple_r2) * (1+simple_r3)
  * ... * (1+simple_rn) - 1     --- (Eq. 3)

Therefore, applying Equation 3 to our example;

Cumulative weekly simple returns =
102.857% * 104.167% * 97.333% * 101.370% * 102.703% 
* 101.316% - 1 = 10.00%
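
As a quick check in Stata, both routes give exactly 10% (the prices are those from the table above):

display (72/70) * (75/72) * (73/75) * (74/73) * (76/74) * (77/76) - 1    // .1
display (77 - 70) / 70                                                   // .1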

Weekly cumulative log returns

Now suppose we wish to find the weekly cumulative log return from the stock prices; again, we use the first and the last stock prices of the week in Equation (2). So, our cumulative weekly log return is as follows:

weekly log ri = ln( 77 /  70) = 9.53%

Since log returns are continuously compounded returns, it is normal to see that the log returns are lower than the simple returns. To find the n-period log return from daily log returns, we just need to sum up the daily log returns. Therefore:

Cumulative weekly log returns = 2.817% + 4.082% + (-2.703%) +
1.361% + 2.667% + 1.307% = 9.53%

Stata Example


We shall continue to use the same data as above. The Stata do file for all of the following steps can be downloaded from here.

The following lines of code will generate the required data

clear
input float date byte(id prices) float wofd
18263 1 70 2600
18264 1 72 2600
18265 1 75 2600
18266 1 73 2600
18267 1 74 2600
18268 1 76 2600
18269 1 77 2600
end
format %td date
format %tw wofd
tsset id date

Now to generate simple and log returns

bys id (date) : gen simple_ri = (price / L.price) -1
bys id (date) : gen log_ri = ln(price / L.price)

Cumulative weekly simple returns

We shall use the ascol program. It can be downloaded from SSC by typing:

ssc install ascol

If the daily returns were calculated with Eq. 1 above (i.e., simple returns) and need to be converted to cumulative n-period returns, we shall use the option returns(simple). For this purpose, we would type the following command:

 
ascol simple_ri, returns(simple) keep(all) toweek

For syntax and option details of ascol, you can type help ascol at the Stata command prompt. We shall just briefly describe the options used in the above command. After typing ascol, we mention the name of the variable for which cumulative returns are needed.

In our case, it is simple_ri. Then, after the comma, we invoke various program options. Our first option is returns(simple), which tells ascol that our data have simple returns; ascol will apply the product method of converting from daily to weekly returns (see Eq. 3 above). Then we use keep(all) to stop ascol from collapsing the data set to a weekly frequency. Absent this option, the data would be reduced to one observation per id and weekly period identifier. The other possibility in this regard is the option price, which can be used if the variable contains stock prices. And finally, we used the toweek option for converting the data to a weekly frequency. Other possible options in this regard are tomonth, toquarter, and toyear.

Cumulative weekly log returns

If the daily returns were calculated using Eq. 2 above (i.e., log returns) and need to be converted to cumulative n-period returns, we shall use the option returns(log). For this purpose, we would type the following command:

 ascol log_ri , returns(log) keep(all) toweek gen(log_cumRi)

The syntax details remain the same as given above. We have used one additional option, gen(log_cumRi), to name the new variable log_cumRi.

date        wofd     simple_ri    log_ri       week_s~i   log_cumRi
01jan2010   2010w1           .            .          .1   .09531018
02jan2010   2010w1    .0285714     .0281709         .1   .09531018
03jan2010   2010w1    .0416667      .040822         .1   .09531018
04jan2010   2010w1   -.0266667    -.0270287         .1   .09531018
05jan2010   2010w1    .0136986     .0136057         .1   .09531018
06jan2010   2010w1     .027027     .0266682         .1   .09531018
07jan2010   2010w1    .0131579     .0130721         .1   .09531018

Stata Dates: Conversion from one format to another

Case 1: From String to Stata format

Usually, when we import data manually into the Stata Editor, the dates are shown in string format, for example, Nov 20, 2011 or November 20, 2011. We can use the gen command with the date() function:

gen newdate = date(oldDate, "MDY")

 

Case 2: From daily to monthly

gen monthly = mofd(daily_date)

 

Case 3: From daily to weekly

gen weekly = wofd(daily_date)

Case 4: From daily to quarterly

gen quarterly = qofd(daily_date)

 

Case 5: From daily to yearly

gen yearly = year(daily_date)

 

Case 6: From monthly to daily

If our date is recorded in monthly numeric format such as 2001m1, 2001m2, etc, then:

gen daily = dofm(monthly_date)
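
The following is a minimal sketch applying these conversions to a built-in dataset that has a daily date variable (the new variable names are illustrative):

sysuse sp500, clear
gen monthly = mofd(date)
format monthly %tm
gen weekly = wofd(date)
format weekly %tw
gen quarterly = qofd(date)
format quarterly %tq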

Research Topics in Islamic Banking and Finance

 


1. How can Islamic financial instruments be used in international trade?

2. A mechanism for inter-bank transactions between Islamic and conventional banks

3. Can the Sharia board play a role in the development of Islamic instruments?

4. Tawarruq as a tool of inter-bank borrowing

5. Risk management framework for Islamic banks: do we need something special?

6. Have the challenges faced by Islamic banks changed over the last decade?

7. The dynamics of financial crisis: Conventional vs Islamic finance

8. Can Zakat be used as a microfinancing tool?

9. Value at Risk of Sukuk and conventional bonds

10. Risk analysis of Murabaha financing and leasing

11. What do customers say about Islamic banking? Values vs religious perspectives

12. Can ownership structure affect earnings management?

13. Collaborative Islamic Banking Service: The Case of Ijarah

14. Success factors of collaboration in Islamic banks

15. Constraints in the application of partnerships in Islamic banks

16. Can Islamic finance reduce nonperforming loans?

17. Which firms use Islamic financing?

18. Can SMEs benefit more from Islamic financing?

19. Islamic banking development and access to credit

20. Islamic finance and economic growth