diff --git a/CNAME b/CNAME
new file mode 100644
index 0000000000..534894d7de
--- /dev/null
+++ b/CNAME
@@ -0,0 +1 @@
+www.maricarmen-a-leiva.com
\ No newline at end of file
diff --git a/_config.yml b/_config.yml
index 29ca0e6bcc..5420843dc8 100755
--- a/_config.yml
+++ b/_config.yml
@@ -1,22 +1,22 @@
-title: Hello, world! I'm David Freeman
+title: Hello, world! I'm Maricarmen A Leiva
description: > # this means to ignore newlines until "baseurl:"
Write an awesome description for your new site here. You can edit this
line in _config.yml. It will appear in your document head meta (for
Google search results) and in your feed.xml site description.
permalink: ':title/'
-baseurl: "/flexible-jekyll" # the subpath of your site, e.g. /blog
-url: "" # the base hostname & protocol for your site, e.g. http://example.com
+baseurl: # the subpath of your site, e.g. /blog
+url: "http://www.maricarmen-a-leiva.com" # the base hostname & protocol for your site, e.g. http://example.com
site-twitter: #if your site has a twitter account, enter it here
# Author Settings
-author: David Freeman # add your name
-author-img: david-freeman.jpg # add your photo
-about-author: I am a web developer focusing on front-end development. Always hungry to keep learning. # add description
-social-twitter: # add your Twitter handle
-social-facebook: # add your Facebook handle
-social-github: artemsheludko # add your Github handle
-social-linkedin: # add your Linkedin handle
-social-email: # add your Email address
+author: Maricarmen Arenas Leiva # add your name
+author-img: me3.png # add your photo
+about-author: My name is Maricarmen and I'm passionate about data analysis and research (business, economics, finance, etc.). I'm always learning something new!
+#social-twitter: # add your Twitter handle
+#social-facebook: # add your Facebook handle
+social-github: butterfly008 # add your Github handle
+social-linkedin: maricarmen-a-l # add your Linkedin handle
+social-email: mari.arenas.leiva@gmail.com # add your Email address
# Disqus
discus-identifier: mr-brown # add your discus identifier
diff --git a/_posts/2015-12-10-stat-analysis.md b/_posts/2015-12-10-stat-analysis.md
new file mode 100644
index 0000000000..8b9dd4497e
--- /dev/null
+++ b/_posts/2015-12-10-stat-analysis.md
@@ -0,0 +1,32 @@
+---
+layout: post
+title: "Healthcare Study"
+date: 2017-09-12 13:32:20 +0300
+description: # Add post description (optional)
+img: health.jpg # Add image post (optional)
+fig-caption: # Add figcaption (optional)
+tags: [stats, healthcare, econometrics]
+---
+
+This was my synthesis project, done during my final year at UQAM as part of my Bachelor's in Economics. The goal of the project was to synthesize economics concepts, and I decided to work on a microeconomics project related to health. The dataset was provided by our professor, and I used econometric (statistical analysis) techniques. The report is in French, but I have translated the conclusion/summary below. The full report contains the graphs and diagrams that give an idea of the whole analysis.
+
+
+
+### Summary-Conclusion from the Study:
+
+To begin with, the reform introduced in Germany in mid-1997 did have an effect on the number of visits to the doctor. Although the effect is not "revolutionary", it is statistically significant (in the short term). The effect of the reform is most noticeable in 1998, since that is the only year in which the reform was in force from start to finish. However, the effect is still visible in 1999, which tells us that people have probably become accustomed to visiting the doctor less. As shown in Table 1 of our descriptive statistics, the average number of visits falls from 2.66 in 1996 to 2.35 in 1998. Furthermore, by running multiple regressions, we conclude that the impact of the reform is on average -0.25 in our model including all control variables, for the year 1998. This figure falls to -0.17 in our fixed-effects model, which in our opinion is the most plausible estimate. That is about 1/13 of the average number of visits to the doctor for all years combined, i.e. roughly 8% less.
+
+These effects appear to be small, but as we have seen, they are statistically significant. What's more, the reform was applied to many other healthcare sectors, which means that the effect of co-payments at the doctor's office must be added to other effects on the healthcare system that we have not analyzed here. Co-payments are therefore one way of helping to reduce healthcare system costs. It would be interesting to see the magnitude of the effect in concrete dollars, but that is beyond the scope of our analysis.
+
+We also note that the reform reduced moral hazard, as people changed their behavior when co-payments were introduced (except for the chronically and severely ill). This is an important point, because it tells us that co-payments reduced unnecessary visits to the doctor. It also suggests that the government should make the population aware of the need to visit the doctor responsibly.
+
+The reform seems to have had less impact on people with serious or chronic illnesses; this is also the conclusion reached by Winkelmann and the other authors in our literature review, owing to those patients' inelastic demand. Table 2 of our descriptive statistics attests to this, although it only captures individuals' perception of their health, since variables relating to health itself were omitted. It would be important to take these people into account during a reform, as a reduction in their visits to the doctor could exacerbate their condition, given that for them the doctor is a basic need.
+
+We also analyzed the different groups seen in the descriptive statistics. Even though women go to the doctor more often than men on average, the reform had almost the same effect on both sexes. Likewise, the effect of the reform seems to be the same across groups with different levels of education. The effect appears relatively large for employed people, but this seems to be due to a trend that existed a priori.
+
+Finally, we believe that a co-payment reform could have a positive effect in Quebec, as it would reduce moral hazard. However, given the differences between Germany and Quebec, we would need to introduce a co-payment system that is appropriate for the population and that does not punish the most disadvantaged (the unemployed).
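+
+For reference, the fixed-effects specification behind the -0.17 estimate can be sketched as follows (my notation, reconstructed from the summary above; $X_{it}$ stands for the control variables):
+
+$$ visits_{it} = \alpha_i + \beta \cdot reform_t + \gamma' X_{it} + \varepsilon_{it}, \qquad \hat{\beta}_{FE} \approx -0.17 $$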
+
+
+
+
+
+
+
+
+[Full report (PDF, in French): Synthesis Activity - Healthcare]({{site.baseurl}}/assets/img/Synthesis-Activity-healthcare.pdf)
\ No newline at end of file
diff --git a/_posts/2016-04-21-Financial-market-project.md b/_posts/2016-04-21-Financial-market-project.md
new file mode 100644
index 0000000000..52a5ee8ddf
--- /dev/null
+++ b/_posts/2016-04-21-Financial-market-project.md
@@ -0,0 +1,336 @@
+---
+layout: post
+title: Financial Market GMM analysis
+date: 2023-07-20 13:32:20 +0300
+description: Project done during my Financial Econometrics class. Here I showcase the Stata code and the PDF file (which can be downloaded). The project is written in French and will eventually be translated.
+ # highlight: WIP # Add post description (optional)
+#url:
+img: financial-stocks.jpg # Add image post (optional)
+fig-caption: Project done during my Financial Econometrics class. Here I showcase the Stata code and the PDF file (which can be downloaded). The project is written in French and will eventually be translated.
+tags: [finances, GMM, Econometrics, stock market]
+---
+
+Project done during my Financial Econometrics class. Here I showcase the Stata code and the PDF file (which can be downloaded). The project is written in French and will eventually be translated.
+
+
+This project is based on the research paper by Jeffrey S. Jones and Brian Kincaid titled "Can the correlation among Dow 30 stocks predict market declines?". I did this research as part of my Economics bachelor's degree; it was a master's-level class, so I needed special permission to take it. It was the hardest class I have taken so far!
+
+
+In this research I used OLS (MCO in French) and GMM regressions.
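+
+Schematically, the simple regressions relate the S&P 500 return at each horizon to the rolling median correlation plus macro controls (my notation, inferred from the Stata code further down; $X_t$ holds GDP growth, the unemployment rate and the corporate bond yield):
+
+$$ snp^{(h)}_t = b_0 + b_1 \cdot corrmed^{(w)}_t + \gamma' X_t + \varepsilon_t $$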
+
+### Conclusion of my research
+
+In conclusion, I'd like to point out that several articles mention that an increase in correlations between assets is linked to financial crises. Articles [^1], [^2] and [^3] in my bibliography mention this and cite other studies that do as well. Financial crises are a scourge for the economy, and it would be valuable to find a way of anticipating them in order to limit the damage. Such a method would be a useful tool for legislators and portfolio managers, and would benefit investors and the general public. Correlations are a relatively simple way of predicting market declines. It would therefore be of great importance to retest Jones and Kincaid's (2014) method of regressing returns on rolling correlations between assets. In fact, according to the "Econpapers" site [^10], Jones and Kincaid's study is the first to attempt to predict market returns (S&P 500) using correlations between Dow 30 assets (with historical data). So, despite the fact that my work has not proven that correlations can predict a market decline, I firmly believe that the study is valid and should be repeated.
+
+Please take a look at the PDF for more detail!
+
+{: data-content="footnotes"}
+
+[^1]: Winkelmann, Rainer. "Health care reform and the number of doctor visits: an econometric analysis." Journal of Applied Econometrics 19: 455-472 (2004). IDEAS. Web. 7 November 2015. http://onlinelibrary.wiley.com.proxy.bibliotheques.uqam.ca:2048/doi/10.1002/jae.764/abstract
+[^2]: Gerfin, Michael; Schellhorn, Martin. "Nonparametric bounds on the effect of deductibles in health care insurance on doctor visits: Swiss evidence." Health Economics 15: 1011-1020 (2006). PubMed. Web. 7 November 2015.
+
+[^3]: Ziebarth, Nicolas R. "Assessing the effectiveness of health care cost containment measures: evidence from the market for rehabilitation care." International Journal of Health Care Finance and Economics 14: 41-67 (2014). PubMed. Web. 7 November 2015.
+
+[^10]: Balan, Marius; Traoré, San Nouhoun. "La réforme des prix des médicaments génériques et les économies de coûts des régimes privés" (November 2012). Web. 4 December 2015. <http://www.conseiller.ca/files/2012/10/8-generiques_1112.pdf>
+
+
+## Stata code
+Below is the code I wrote in Stata (the software I used for the statistical analysis).
+
+``` stata
+
+/////////////////////////Eco8620
+
+
+clear
+global root ="C:\Users\Maricarmen\Desktop\TRAVAIL SESSION 8620\travail de session\DATA\01-do-file"
+global raw = "C:\Users\Maricarmen\Desktop\TRAVAIL SESSION 8620\travail de session\DATA\02 raw file"
+global work = "C:\Users\Maricarmen\Desktop\TRAVAIL SESSION 8620\travail de session\DATA\03-work"
+use "$raw\ppeco8620MaricarmenArenas.dta" , clear
+
+
+//time series (before building my tables and regressions I had to transform my data into a time series)
+
+/*gen temps2 = date(temps, "MDY")
+
+ //dummy variables for each decade
+
+format temps2 %td
+
+tsset temps2
+
+gen d1990=0
+gen d2000=0
+gen d2010=0
+replace d1990=1 if tin(01mar1990,31dec1999)
+
+replace d2000=1 if tin(01jan2000,31dec2009)
+
+replace d2010=1 if tin(01jan2010,31mar2016)*/
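+
+// (inferred: the commented block above was run once to build temps2 and the
+// decade dummies d1990/d2000/d2010, which the graphs and regressions below use)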
+
+
+//scatter snp3_6 temps2 if d2000 , connect(2) clwidth(medthick) clcolor(black) clpattern(dot) || scatter corrmed6 temps2 if tin(01jan2000,01dec2009) , connect(2) clwidth(medthick) clcolor(black) clpattern(dot) mps2 if tin(01jan2000,01dec2009) , connect(2) clwidth(medthick) clcolor(black) clpattern(dot)
+
+// graphs/figures - S&P 500 and correlations over time
+
+ scatter SP500 temps2 if d1990, connect(2) clwidth(medthick) clcolor(black) clpattern(dot) c(l) yaxis(1) || scatter corrmed18 temps2 if d1990 , connect(2) clwidth(medthick) clcolor(black) clpattern(dot) c(l) yaxis(2) title("1990s: correlations vs S&P 500 returns")
+
+ scatter SP500 temps2 if d2000 , connect(2) clwidth(medthick) clcolor(black) clpattern(dot) c(l) yaxis(1) || scatter corrmed18 temps2 if d2000 , connect(2) clwidth(medthick) clcolor(black) clpattern(dot) c(l) yaxis(2) title("2000s: correlations vs S&P 500 returns")
+
+ scatter SP500 temps2 if d2010, connect(2) clwidth(medthick) clcolor(black) clpattern(dot) c(l) yaxis(1) || scatter corrmed18 temps2 if d2010, connect(2) clwidth(medthick) clcolor(black) clpattern(dot) c(l) yaxis(2) title("2010s: correlations vs S&P 500 returns")
+
+
+//table of means, medians, standard deviations, and quantiles
+
+tabstat snp3_6 snp6_6 snp9_6 snp12_6 snp3_12 snp6_12 snp9_12 snp12_12 snp3_18 snp6_18 snp9_18 snp12_18 corrmed6 corrmed12 corrmed18 corpbondyield GDP_growth unemp_rate, stats(mean p1 p5 p10 p25 p50 p75 p90 p95 p99 sd) columns(statistics)
+
+
+// simple regressions
+
+//12 regressions
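+// Pattern for each block below: OLS with heteroskedasticity-robust standard
+// errors (vce(robust)), then outreg2 exports the results to an Excel table
+// (replace starts a new file, append adds a column).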
+
+
+reg snp3_6 corrmed6 GDP_growth unemp_rate corpbondyield, vce(robust)
+outreg2 using Tp.xls, replace ctitle(Regression)
+
+reg snp6_6 corrmed6 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp9_6 corrmed6 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp12_6 corrmed6 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp3_12 corrmed12 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp6_12 corrmed12 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp9_12 corrmed12 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp12_12 corrmed12 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp3_18 corrmed18 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp6_18 corrmed18 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp9_18 corrmed18 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+reg snp12_18 corrmed18 corpbondyield GDP_growth unemp_rate , vce(robust)
+outreg2 using Tp.xls, append ctitle(Regression)
+
+// 36 regressions
+
+// regressions by decade, with dummy variables: 1990s
+
+reg snp3_6 corrmed6 unemp_rate corpbondyield GDP_growth if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, replace ctitle(Regression)
+
+reg snp6_6 corrmed6 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp9_6 corrmed6 corpbondyield GDP_growth unemp_rate if d1990 , vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp12_6 corrmed6 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp3_12 corrmed12 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp6_12 corrmed12 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp9_12 corrmed12 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp12_12 corrmed12 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp3_18 corrmed18 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp6_18 corrmed18 corpbondyield GDP_growth unemp_rate if d1990 , vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp9_18 corrmed18 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+
+reg snp12_18 corrmed18 corpbondyield GDP_growth unemp_rate if d1990, vce(robust)
+outreg2 using ppMaricarmen.xls, append ctitle(Regression)
+//2000--------------------------------------------------------------------------------
+
+reg snp3_6 corrmed6 unemp_rate corpbondyield GDP_growth if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, replace ctitle(Regression)
+
+reg snp6_6 corrmed6 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp9_6 corrmed6 corpbondyield GDP_growth unemp_rate if d2000 , vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp12_6 corrmed6 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp3_12 corrmed12 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp6_12 corrmed12 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp9_12 corrmed12 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp12_12 corrmed12 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp3_18 corrmed18 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp6_18 corrmed18 corpbondyield GDP_growth unemp_rate if d2000 , vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp9_18 corrmed18 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+reg snp12_18 corrmed18 corpbondyield GDP_growth unemp_rate if d2000, vce(robust)
+outreg2 using ppMaricarmen2.xls, append ctitle(Regression)
+
+//2010-
+
+reg snp3_6 corrmed6 unemp_rate corpbondyield GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, replace ctitle(Regression)
+
+reg snp6_6 corrmed6 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp9_6 corrmed6 corpbondyield unemp_rate GDP_growth if d2010 , vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp12_6 corrmed6 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp3_12 corrmed12 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp6_12 corrmed12 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp9_12 corrmed12 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp12_12 corrmed12 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp3_18 corrmed18 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp6_18 corrmed18 corpbondyield unemp_rate GDP_growth if d2010 , vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp9_18 corrmed18 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+reg snp12_18 corrmed18 corpbondyield unemp_rate GDP_growth if d2010, vce(robust)
+outreg2 using ppMaricarmen3.xls, append ctitle(Regression)
+
+
+//correlations for GMM
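+// Relevance check on the instruments: the xN series should be correlated with
+// the endogenous regressor (corrmed*); exogeneity (no correlation with the
+// error term) cannot be read off these tables.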
+
+
+pwcorr snp12_6 snp6_6 snp3_6 snp9_6 corrmed6
+pwcorr x1 x2 x3 x4 x5 x6 corrmed6
+
+pwcorr x1 x2 x3 x4 x5 x6 snp3_6 snp6_6 snp9_6 snp12_6
+
+pwcorr snp3_12 snp6_12 snp9_12 snp12_12 corrmed12
+
+pwcorr x7 x8 x9 x10 x11 x12 corrmed12
+
+pwcorr x7 x8 x9 x10 x11 x12 snp3_12 snp6_12 snp9_12 snp12_12
+
+
+pwcorr snp3_18 snp6_18 snp9_18 snp12_18 corrmed18
+pwcorr x13 x14 x15 x16 x17 x18 corrmed18
+
+pwcorr x13 x14 x15 x16 x17 x18 snp3_18 snp6_18 snp9_18 snp12_18
+
+
+//GMM
+// 12 regressions
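+// Each gmm call estimates the linear model snp* = b0 + xb(corrmed*, controls)
+// by two-step GMM, with the xN series as instruments and a classical
+// (non-robust) covariance estimator, vce(unadjusted).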
+
+
+sca b0 = 1
+gmm (snp3_6 - {xb:corrmed6 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x1 x2 x3 x4 x5 x6) twostep vce(unadjusted)
+
+sca b0 = 1
+gmm (snp6_6 - {xb:corrmed6 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x1 x2 x3 x4 x5 x6) twostep vce(unadjusted)
+
+
+sca b0 = 1
+gmm (snp9_6 - {xb:corrmed6 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x1 x2 x3 x4 x5 x6) twostep vce(unadjusted)
+
+
+sca b0 = 1
+gmm (snp12_6 - {xb:corrmed6 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x1 x2 x3 x4 x5 x6) twostep vce(unadjusted)
+
+
+//---------------------------------------------------------------------------------------------------------------------
+
+
+
+sca b0 = 1
+gmm (snp3_12 - {xb:corrmed12 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x7 x8 x9 x10 x11 x12) twostep vce(unadjusted)
+
+sca b0 = 1
+gmm (snp6_12 - {xb:corrmed12 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x7 x8 x9 x10 x11 x12) twostep vce(unadjusted)
+
+
+sca b0 = 1
+gmm (snp9_12 - {xb:corrmed12 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x7 x8 x9 x10 x11 x12) twostep vce(unadjusted)
+
+
+
+sca b0 = 1
+gmm (snp12_12 - {xb:corrmed12 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x7 x8 x9 x10 x11 x12) twostep vce(unadjusted)
+
+
+
+//------------------------------------------------------------------------------------------------------------------------------------
+
+
+
+sca b0 = 1
+gmm (snp3_18 - {xb:corrmed18 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x13 x14 x15 x16 x17 x18) twostep vce(unadjusted)
+
+sca b0 = 1
+gmm (snp6_18 - {xb:corrmed18 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x13 x14 x15 x16 x17 x18) twostep vce(unadjusted)
+
+
+sca b0 = 1
+gmm (snp9_18 - {xb:corrmed18 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x13 x14 x15 x16 x17 x18) twostep vce(unadjusted)
+
+
+sca b0 = 1
+gmm (snp12_18 - {xb:corrmed18 unemp_rate corpbondyield GDP_growth} - {b0} ), instruments (x13 x14 x15 x16 x17 x18) twostep vce(unadjusted)
+
+
+//%%=================================================================================================================================================
+
+
+return list
+ereturn list
+
+mat resultat = r(table)
+sca b1 = resultat[1,1]
+dis b1
+
+```
\ No newline at end of file
diff --git a/_posts/2016-05-05-directed-studies.md b/_posts/2016-05-05-directed-studies.md
new file mode 100644
index 0000000000..ba05e158b2
--- /dev/null
+++ b/_posts/2016-05-05-directed-studies.md
@@ -0,0 +1,21 @@
+---
+layout: post
+title: Behavioural economics applied to retirement saving policies
+date: 2023-07-20 13:32:20 +0300
+description: A directed readings project on behavioural economics applied to retirement saving policies, completed during my Economics degree at UQAM. # Add post description (optional)
+#url:
+img: retirement.png # Add image post (optional)
+fig-caption: A directed readings project on behavioural economics applied to retirement saving policies, completed during my Economics degree at UQAM.
+tags: [Retirement, savings, studies]
+---
+
+While completing my degree at UQAM (2013-2016) I took a directed readings class on behavioural economics applied to retirement saving policies. This was an independent class, meaning that I took the initiative to approach a professor, and I was the only student working on the project. The report is in French, but below I include the translated introduction.
+
+
+### Intro-summary
+
+In this text, we study the application of behavioral economics to policies targeting retirement savings. Retirement savings has become a significant issue: on the one hand, we have an aging population, and therefore a future increase in the tax burden that taxpayers will have to bear; on the other hand, we have financial products that are becoming more and more sophisticated and that require increased financial literacy. In Quebec, as everywhere in Canada, the subject is topical. The following document, divided into four sections, analyzes various academic economics articles that focus on different aspects of the question. First, we try to quantify the savings needed for savers to have a good retirement, and we then focus on the situation in Canada. Next, we try to explain the behavioral biases that have been uncovered, drawing on psychological and economic theories as well as empirical data. We look at time inconsistency, which is at the root of procrastination and inertia, as well as loss aversion and mental accounting. We end by discussing public policies that take these biases into account and that could be, or have already been, implemented, and we compare them with Quebec's retirement savings vehicles such as the TFSA, the VRSP and the RRSP.
+
+
+
+
\ No newline at end of file
diff --git a/_posts/2017-04-06-how-i-rest-from-work.markdown b/_posts/2017-04-06-how-i-rest-from-work.markdown
deleted file mode 100644
index 0448318914..0000000000
--- a/_posts/2017-04-06-how-i-rest-from-work.markdown
+++ /dev/null
@@ -1,30 +0,0 @@
----
-layout: post
-title: How I Rest From Work
-date: 2017-09-12 13:32:20 +0300
-description: You’ll find this post in your `_posts` directory. Go ahead and edit it and re-build the site to see your changes. # Add post description (optional)
-img: i-rest.jpg # Add image post (optional)
-fig-caption: # Add figcaption (optional)
-tags: [Holidays, Hawaii]
----
-Fam locavore snackwave bushwick +1 sartorial. Selfies portland knausgaard synth. Pop-up art party marfa deep v pitchfork subway tile 3 wolf moon. Ennui pinterest tumblr yr, adaptogen succulents copper mug twee. Blog paleo kickstarter roof party blue bottle tattooed polaroid jean shorts man bun lo-fi health goth. Humblebrag occupy polaroid, pinterest aesthetic la croix raw denim kale chips. 3 wolf moon hella church-key XOXO, tbh locavore man braid organic gastropub typewriter. Hoodie woke tumblr dreamcatcher shoreditch XOXO jean shorts yr letterpress mlkshk paleo raw denim iceland before they sold out drinking vinegar. Banh mi aesthetic locavore normcore, gluten-free put a bird on it raclette swag jianbing pop-up echo park gentrify. Stumptown brooklyn godard tumeric ethical. Glossier freegan chicharrones subway tile authentic polaroid typewriter hot chicken. Thundercats small batch heirloom meggings.
-
-## Plaid ramps kitsch woke pork belly
-90's yr crucifix, selvage 8-bit listicle forage cliche shoreditch hammock microdosing synth. Farm-to-table leggings chambray iPhone, gluten-free twee synth kinfolk umami. Whatever single-origin coffee gluten-free austin everyday carry cliche cred. Plaid ramps kitsch woke pork belly organic. Trust fund whatever coloring book kombucha brooklyn. Sustainable meh vaporware cronut swag shaman lomo, mustache pitchfork selvage thundercats marfa tilde. Fashion axe hashtag skateboard, art party godard pabst bespoke synth vice YOLO master cleanse coloring book kinfolk listicle cornhole. Try-hard mixtape umami fanny pack man bun gastropub franzen tbh. Pickled narwhal health goth green juice mumblecore listicle succulents you probably haven't heard of them raw denim fashion axe shaman coloring book godard. Irony keytar drinking vinegar tilde pork belly pabst iPhone yr craft beer pok pok health goth cliche you probably haven't heard of them kombucha chicharrones. Direct trade hella roof party chia. Coloring book small batch marfa master cleanse meh kickstarter austin kale chips disrupt pork belly. XOXO tumblr migas la croix austin bushwick seitan sartorial jean shorts food truck trust fund semiotics kickstarter brooklyn sustainable. Umami knausgaard mixtape marfa. Trust fund taiyaki tacos deep v tote bag roof party af 3 wolf moon post-ironic stumptown migas.
-
-![I and My friends]({{site.baseurl}}/assets/img/we-in-rest.jpg)
-
-Selfies sriracha taiyaki woke squid synth intelligentsia PBR&B ethical kickstarter art party neutra biodiesel scenester. Health goth kogi VHS fashion axe glossier disrupt, vegan quinoa. Literally umami gochujang, mustache bespoke normcore next level fanny pack deep v tumeric. Shaman vegan affogato chambray. Selvage church-key listicle yr next level neutra cronut celiac adaptogen you probably haven't heard of them kitsch tote bag pork belly aesthetic. Succulents wolf stumptown art party poutine. Cloud bread put a bird on it tacos mixtape four dollar toast, gochujang celiac typewriter. Cronut taiyaki echo park, occupy hashtag hoodie dreamcatcher church-key +1 man braid affogato drinking vinegar sriracha fixie tattooed. Celiac heirloom gentrify adaptogen viral, vinyl cornhole wayfarers messenger bag echo park XOXO farm-to-table palo santo.
-
->Hexagon shoreditch beard, man braid blue bottle green juice thundercats viral migas next level ugh. Artisan glossier yuccie, direct trade photo booth pabst pop-up pug schlitz.
-
-Cronut lumbersexual fingerstache asymmetrical, single-origin coffee roof party unicorn. Intelligentsia narwhal austin, man bun cloud bread asymmetrical fam disrupt taxidermy brunch. Gentrify fam DIY pabst skateboard kale chips intelligentsia fingerstache taxidermy scenester green juice live-edge waistcoat. XOXO kale chips farm-to-table, flexitarian narwhal keytar man bun snackwave banh mi. Semiotics pickled taiyaki cliche cold-pressed. Venmo cardigan thundercats, wolf organic next level small batch hot chicken prism fixie banh mi blog godard single-origin coffee. Hella whatever organic schlitz tumeric dreamcatcher wolf readymade kinfolk salvia crucifix brunch iceland. Literally meditation four loko trust fund. Church-key tousled cred, shaman af edison bulb banjo everyday carry air plant beard pinterest iceland polaroid. Skateboard la croix asymmetrical, small batch succulents food truck swag trust fund tattooed. Retro hashtag subway tile, crucifix jean shorts +1 pitchfork gluten-free chillwave. Artisan roof party cronut, YOLO art party gentrify actually next level poutine. Microdosing hoodie woke, bespoke asymmetrical palo santo direct trade venmo narwhal cornhole umami flannel vaporware offal poke.
-
-* Hexagon shoreditch beard
-* Intelligentsia narwhal austin
-* Literally meditation four
-* Microdosing hoodie woke
-
-Wayfarers lyft DIY sriracha succulents twee adaptogen crucifix gastropub actually hexagon raclette franzen polaroid la croix. Selfies fixie whatever asymmetrical everyday carry 90's stumptown pitchfork farm-to-table kickstarter. Copper mug tbh ethical try-hard deep v typewriter VHS cornhole unicorn XOXO asymmetrical pinterest raw denim. Skateboard small batch man bun polaroid neutra. Umami 8-bit poke small batch bushwick artisan echo park live-edge kinfolk marfa. Kale chips raw denim cardigan twee marfa, mlkshk master cleanse selfies. Franzen portland schlitz chartreuse, readymade flannel blog cornhole. Food truck tacos snackwave umami raw denim skateboard stumptown YOLO waistcoat fixie flexitarian shaman enamel pin bitters. Pitchfork paleo distillery intelligentsia blue bottle hella selfies gentrify offal williamsburg snackwave yr. Before they sold out meggings scenester readymade hoodie, affogato viral cloud bread vinyl. Thundercats man bun sriracha, neutra swag knausgaard jean shorts. Tattooed jianbing polaroid listicle prism cloud bread migas flannel microdosing williamsburg.
-
-Echo park try-hard irony tbh vegan pok pok. Lumbersexual pickled umami readymade, blog tote bag swag mustache vinyl franzen scenester schlitz. Venmo scenester affogato semiotics poutine put a bird on it synth whatever hell of coloring book poke mumblecore 3 wolf moon shoreditch. Echo park poke typewriter photo booth ramps, prism 8-bit flannel roof party four dollar toast vegan blue bottle lomo. Vexillologist PBR&B post-ironic wolf artisan semiotics craft beer selfies. Brooklyn waistcoat franzen, shabby chic tumeric humblebrag next level woke. Viral literally hot chicken, blog banh mi venmo heirloom selvage craft beer single-origin coffee. Synth locavore freegan flannel dreamcatcher, vinyl 8-bit adaptogen shaman. Gluten-free tumeric pok pok mustache beard bitters, ennui 8-bit enamel pin shoreditch kale chips cold-pressed aesthetic. Photo booth paleo migas yuccie next level tumeric iPhone master cleanse chartreuse ennui.
diff --git a/_posts/2017-04-06-welcome-to-jekyll.markdown b/_posts/2017-04-06-welcome-to-jekyll.markdown
deleted file mode 100755
index 2add01da6c..0000000000
--- a/_posts/2017-04-06-welcome-to-jekyll.markdown
+++ /dev/null
@@ -1,26 +0,0 @@
----
-layout: post
-title: "Welcome to Jekyll!"
-date: 2017-04-06 13:32:20 +0300
-description: You’ll find this post in your `_posts` directory. Go ahead and edit it and re-build the site to see your changes. # Add post description (optional)
-img: # Add image post (optional)
----
-You’ll find this post in your `_posts` directory. Go ahead and edit it and re-build the site to see your changes. You can rebuild the site in many different ways, but the most common way is to run `jekyll serve`, which launches a web server and auto-regenerates your site when a file is updated.
-
-To add new posts, simply add a file in the `_posts` directory that follows the convention `YYYY-MM-DD-name-of-post.ext` and includes the necessary front matter. Take a look at the source for this post to get an idea about how it works.
-
-Jekyll also offers powerful support for code snippets:
-
-{% highlight ruby %}
-def print_hi(name)
- puts "Hi, #{name}"
-end
-print_hi('Tom')
-#=> prints 'Hi, Tom' to STDOUT.
-{% endhighlight %}
-
-Check out the [Jekyll docs][jekyll-docs] for more info on how to get the most out of Jekyll. File all bugs/feature requests at [Jekyll’s GitHub repo][jekyll-gh]. If you have questions, you can ask them on [Jekyll Talk][jekyll-talk].
-
-[jekyll-docs]: https://jekyllrb.com/docs/home
-[jekyll-gh]: https://github.com/jekyll/jekyll
-[jekyll-talk]: https://talk.jekyllrb.com/
diff --git a/_posts/2017-09-10-conference-on-javascript.markdown b/_posts/2017-09-10-conference-on-javascript.markdown
deleted file mode 100644
index bdc0815a9b..0000000000
--- a/_posts/2017-09-10-conference-on-javascript.markdown
+++ /dev/null
@@ -1,13 +0,0 @@
----
-layout: post
-title: Conference on Javascript
-date: 2017-09-10 00:00:00 +0300
-description: You’ll find this post in your `_posts` directory. Go ahead and edit it and re-build the site to see your changes. # Add post description (optional)
-img: js-1.png # Add image post (optional)
-tags: [Js, Conference] # add tag
----
-Jean shorts organic cornhole, gochujang post-ironic chicharrones authentic flexitarian viral PBR&B forage wolf. Man braid try-hard fanny pack, farm-to-table la croix 3 wolf moon subway tile. Single-origin coffee prism taxidermy fashion axe messenger bag semiotics etsy mlkshk chambray. Marfa lumbersexual meditation celiac. Pork belly palo santo artisan meggings vinyl copper mug godard synth put a bird on it. Cloud bread pop-up quinoa, raw denim meditation 8-bit slow-carb. Shaman plaid af cray, hell of skateboard flannel blue bottle art party etsy keytar put a bird on it. Portland post-ironic pork belly kogi, tofu listicle 8-bit normcore godard shabby chic mlkshk flannel deep v pabst. Pork belly kinfolk fingerstache lo-fi raclette. Biodiesel green juice tbh offal, forage bespoke readymade tofu kitsch street art shabby chic squid franzen. Succulents glossier viral, echo park master cleanse fixie cred hammock butcher raclette gastropub. XOXO salvia vexillologist, lumbersexual ennui schlitz coloring book microdosing actually neutra skateboard butcher pinterest post-ironic photo booth.
-
-Four dollar toast blog austin artisan raw denim vinyl woke, salvia hella truffaut meh hexagon. Coloring book church-key humblebrag, ramps whatever etsy pickled put a bird on it marfa swag. Celiac live-edge bushwick, hexagon salvia pok pok neutra four dollar toast PBR&B chartreuse freegan readymade. Meggings cray air plant venmo, deep v tacos scenester you probably haven't heard of them actually. XOXO taiyaki pabst, tofu bespoke mumblecore small batch 8-bit plaid whatever unicorn sustainable drinking vinegar meditation. Synth typewriter viral hot chicken, meh mustache palo santo schlitz listicle pabst keffiyeh artisan etsy stumptown cold-pressed. Occupy locavore cray irony. Chambray whatever vaporware keffiyeh heirloom vice. Single-origin coffee neutra iPhone lyft. Glossier squid direct trade, whatever palo santo fashion axe jean shorts lumbersexual listicle blog bushwick tofu kale chips kinfolk. Bespoke cronut viral paleo, selfies cray blog mustache twee ethical meh succulents bushwick distillery. Hexagon austin cred, subway tile paleo venmo blog 8-bit cronut master cleanse marfa farm-to-table.
-
-Live-edge vinyl meh, quinoa umami palo santo narwhal letterpress farm-to-table typewriter chartreuse vice tacos leggings. Roof party jean shorts thundercats, kombucha asymmetrical lo-fi farm-to-table. Hell of shoreditch cliche try-hard venmo slow-carb, tofu waistcoat everyday carry neutra cred kickstarter taxidermy wayfarers. Direct trade banh mi pug skateboard banjo edison bulb. Intelligentsia cliche quinoa synth umami. Trust fund four loko hoodie paleo cray tote bag slow-carb ennui. Williamsburg food truck intelligentsia trust fund. Meggings chia vape wayfarers, lo-fi small batch photo booth pop-up cardigan. Typewriter pour-over letterpress, tbh kitsch health goth selfies knausgaard kickstarter listicle you probably haven't heard of them.
diff --git a/_posts/2017-09-12-10-tips-to-improve-your-workflow.markdown b/_posts/2017-09-12-10-tips-to-improve-your-workflow.markdown
deleted file mode 100644
index c952e86d83..0000000000
--- a/_posts/2017-09-12-10-tips-to-improve-your-workflow.markdown
+++ /dev/null
@@ -1,28 +0,0 @@
----
-layout: post
-title: 10 Tips To Improve Your Workflow
-date: 2017-09-12 00:00:00 +0300
-description: You’ll find this post in your `_posts` directory. Go ahead and edit it and re-build the site to see your changes. # Add post description (optional)
-img: workflow.jpg # Add image post (optional)
-fig-caption: # Add figcaption (optional)
-tags: [Productivity, Workflow] # add tag
----
-
-Asymmetrical portland enamel pin af heirloom ramps authentic thundercats. Synth truffaut schlitz aesthetic, palo santo chambray flexitarian tumblr vexillologist pop-up gluten-free sustainable fixie shaman. Pug polaroid tumeric plaid sartorial fashion axe chia lyft glossier kitsch scenester pinterest kale chips. Blog etsy umami fashion axe shoreditch. Prism chambray heirloom, drinking vinegar portland paleo slow-carb. Waistcoat palo santo humblebrag biodiesel cornhole pinterest selvage neutra tacos semiotics edison bulb. Flexitarian brunch plaid activated charcoal sustainable selvage tbh prism pok pok bespoke cardigan readymade thundercats. Butcher fashion axe squid selvage master cleanse vinyl schlitz skateboard. Lomo shaman man bun keffiyeh asymmetrical listicle. Kickstarter trust fund fanny pack post-ironic wayfarers swag kitsch. Shaman pug kale chips meh squid.
-
-### Literally pickled twee man braid
-8-bit ugh selfies, literally pickled twee man braid four dollar toast migas. Slow-carb mustache meggings pok pok. Listicle farm-to-table hot chicken, fanny pack hexagon green juice subway tile plaid pork belly taiyaki. Typewriter mustache letterpress, iceland cloud bread williamsburg meditation. Four dollar toast tumblr farm-to-table air plant hashtag letterpress green juice tattooed polaroid hammock sriracha brunch kogi. Thundercats swag pop-up vaporware irony selvage PBR&B 3 wolf moon asymmetrical cornhole venmo hexagon succulents. Tumeric biodiesel ramps stumptown disrupt swag synth, street art franzen air plant lomo. Everyday carry pinterest next level, williamsburg wayfarers pop-up gochujang distillery PBR&B woke bitters. Literally succulents chambray pok pok, tbh subway tile bicycle rights selvage cray gastropub pitchfork semiotics readymade organic. Vape flexitarian tumblr raclette organic direct trade. Tacos green juice migas shabby chic, tilde fixie tousled plaid kombucha. +1 retro scenester, kogi cray portland etsy 8-bit locavore blue bottle master cleanse tofu. PBR&B adaptogen chartreuse knausgaard palo santo intelligentsia.
-
-![Macbook]({{site.baseurl}}/assets/img/mac.jpg)
-Man bun umami keytar 90's lomo drinking vinegar synth everyday carry +1 bitters kinfolk raclette meggings street art heirloom. Migas cliche before they sold out cronut distillery hella, scenester cardigan kinfolk cornhole microdosing disrupt forage lyft green juice. Tofu deep v food truck live-edge edison bulb vice. Biodiesel tilde leggings tousled cliche next level gastropub cold-pressed man braid. Lyft humblebrag squid viral, vegan chicharrones vice kinfolk. Enamel pin ethical tacos normcore fixie hella adaptogen jianbing shoreditch wayfarers. Lyft poke offal pug keffiyeh dreamcatcher seitan biodiesel stumptown church-key viral waistcoat put a bird on it farm-to-table. Meggings pitchfork master cleanse pickled venmo. Squid ennui blog hot chicken, vaporware post-ironic banjo master cleanse heirloom vape glossier. Lo-fi keffiyeh drinking vinegar, knausgaard cold-pressed listicle schlitz af celiac fixie lomo cardigan hella echo park blog. Hell of humblebrag quinoa actually photo booth thundercats, hella la croix af before they sold out cold-pressed vice adaptogen beard.
-
-### Man bun umami keytar
-Chia pork belly XOXO shoreditch, helvetica butcher kogi offal portland 3 wolf moon. Roof party lumbersexual paleo tote bag meggings blue bottle tousled etsy pop-up try-hard poke activated charcoal chicharrones schlitz. Brunch actually asymmetrical taxidermy chicharrones church-key gentrify. Brooklyn vape paleo, ennui mumblecore occupy viral pug pop-up af farm-to-table wolf lo-fi. Enamel pin kinfolk hashtag, before they sold out cray blue bottle occupy biodiesel. Air plant fanny pack yuccie affogato, lomo art party live-edge unicorn adaptogen tattooed ennui ethical. Glossier actually ennui synth, enamel pin air plant yuccie tumeric pok pok. Ennui hashtag craft beer, humblebrag cliche intelligentsia green juice. Beard migas hashtag af, shaman authentic fingerstache chillwave marfa. Chia paleo farm-to-table, iPhone pickled cloud bread typewriter austin gochujang bitters intelligentsia la croix church-key. Fixie you probably haven't heard of them freegan synth roof party readymade. Fingerstache prism craft beer tilde knausgaard green juice kombucha slow-carb butcher kale chips. Snackwave organic tbh ennui XOXO. Hell of woke blue bottle, tofu roof party food truck pok pok thundercats. Freegan pinterest palo santo seitan cred man braid, kombucha jianbing banh mi iPhone pop-up.
-
->Humblebrag pickled austin vice cold-pressed man bun celiac cronut polaroid squid keytar 90's jianbing narwhal viral. Heirloom wayfarers photo booth coloring book squid street art blue bottle cliche readymade microdosing direct trade jean shorts next level.
-
-Selvage messenger bag meh godard. Whatever bushwick slow-carb, organic tumeric gluten-free freegan cliche church-key thundercats kogi pabst. Hammock deep v everyday carry intelligentsia hell of helvetica. Occupy affogato pop-up bicycle rights paleo. Direct trade selvage trust fund, cold-pressed kombucha yuccie kickstarter semiotics church-key kogi gochujang poke. Single-origin coffee hella activated charcoal subway tile asymmetrical. Adaptogen normcore wayfarers pickled lomo. Ethical edison bulb shaman wayfarers cold-pressed woke. Helvetica selfies blue bottle deep v. Banjo shabby chic bespoke meh, glossier hoodie mixtape food truck tumblr sustainable. Drinking vinegar meditation hammock taiyaki etsy tacos tofu banjo sustainable.
-
-Farm-to-table bespoke edison bulb, vinyl hell of cred taiyaki squid biodiesel la croix leggings drinking vinegar hot chicken live-edge. Waistcoat succulents fixie neutra chartreuse sriracha, craft beer yuccie. Ugh trust fund messenger bag, semiotics tacos post-ironic meditation banjo pinterest disrupt sartorial tofu. Meh health goth art party retro skateboard, pug vaporware shaman. Meh whatever microdosing cornhole. Hella salvia pinterest four loko shabby chic yr. Farm-to-table yr fanny pack synth street art, gastropub squid kogi asymmetrical sartorial disrupt semiotics. Kombucha copper mug vice sriracha +1. Tacos hashtag PBR&B taiyaki franzen cornhole. Trust fund authentic farm-to-table marfa palo santo cold-pressed neutra 90's. VHS artisan drinking vinegar readymade yr. Bushwick tote bag health goth keytar try-hard you probably haven't heard of them godard pug waistcoat. Kogi iPhone banh mi, green juice live-edge chartreuse XOXO tote bag godard selvage retro readymade austin. Leggings ramps tacos iceland raw denim semiotics woke hell of lomo. Brooklyn woke adaptogen normcore pitchfork skateboard.
-
-Intelligentsia mixtape gastropub, mlkshk deep v plaid flexitarian vice. Succulents keytar craft beer shabby chic. Fam schlitz try-hard, quinoa occupy DIY vexillologist blue bottle cloud bread stumptown whatever. Sustainable cloud bread beard fanny pack vexillologist health goth. Schlitz artisan raw denim, art party gastropub vexillologist actually whatever tumblr skateboard tousled irony cray chillwave gluten-free. Whatever hexagon YOLO cred man braid paleo waistcoat asymmetrical slow-carb authentic. Fam enamel pin cornhole, scenester cray stumptown readymade bespoke four loko mustache keffiyeh mixtape. Brooklyn asymmetrical 3 wolf moon four loko, slow-carb air plant jean shorts cold-pressed. Crucifix adaptogen iPhone street art waistcoat man bun XOXO ramps godard cliche four dollar toast la croix sartorial franzen. Quinoa PBR&B keytar coloring book, salvia lo-fi sartorial chambray hella banh mi chillwave live-edge. Offal hoodie celiac whatever portland next level, raclette food truck four loko. Craft beer kale chips banjo humblebrag brunch ugh. Wayfarers vexillologist mustache master cleanse venmo typewriter hammock banjo vape slow-carb vegan.
diff --git a/_posts/2017-09-12-how-to-start-programming.markdown b/_posts/2017-09-12-how-to-start-programming.markdown
deleted file mode 100644
index 48e3b9813b..0000000000
--- a/_posts/2017-09-12-how-to-start-programming.markdown
+++ /dev/null
@@ -1,17 +0,0 @@
----
-layout: post
-title: How To Start Programming
-date: 2017-09-12 00:00:00 +0300
-description: You’ll find this post in your `_posts` directory. Go ahead and edit it and re-build the site to see your changes. # Add post description (optional)
-img: how-to-start.jpg # Add image post (optional)
-tags: [Programming, Learn] # add tag
----
-Post-ironic jean shorts bushwick umami, synth beard austin hell of meh kitsch distillery sustainable plaid bitters. Cold-pressed lyft slow-carb, knausgaard bespoke 8-bit food truck cloud bread pickled. Taiyaki bitters trust fund heirloom craft beer single-origin coffee. Readymade fam vape blue bottle cold-pressed, flannel polaroid. Aesthetic four dollar toast semiotics af bicycle rights. Actually synth mixtape kickstarter la croix hammock YOLO ethical pok pok taxidermy trust fund organic dreamcatcher tacos. Franzen four loko man braid letterpress umami offal. Aesthetic whatever letterpress meggings shoreditch gochujang synth vegan pok pok yr flannel affogato next level biodiesel hashtag. Banjo vaporware lyft unicorn tumblr. Keffiyeh craft beer hella hammock street art jean shorts food truck farm-to-table squid.
-
->Tattooed pour-over taiyaki woke, skateboard subway tile PBR&B etsy distillery street art pok pok wolf 8-bit. Vegan bicycle rights schlitz subway tile unicorn taiyaki.
-
-Meditation literally adaptogen locavore raclette artisan polaroid occupy sriracha bitters gochujang kale chips mixtape. Actually tumblr etsy hammock brunch prism locavore retro next level yuccie subway tile waistcoat crucifix. Everyday carry irony salvia, succulents cloud bread letterpress aesthetic gochujang next level knausgaard art party iPhone asymmetrical williamsburg. Iceland slow-carb knausgaard narwhal skateboard kitsch fashion axe. Man bun celiac street art, cliche PBR&B lomo blue bottle beard bitters. Mlkshk occupy offal dreamcatcher. Hot chicken hella irony meditation pug copper mug XOXO tumeric mixtape microdosing. Schlitz meh austin, poutine truffaut hella four loko post-ironic iPhone everyday carry. Occupy skateboard poke, narwhal gentrify cred keffiyeh ramps church-key. Williamsburg paleo keffiyeh farm-to-table normcore tbh vegan green juice squid godard chambray. DIY organic letterpress, venmo salvia crucifix gluten-free. Yr celiac tbh selfies activated charcoal.
-
-Adaptogen retro 8-bit mlkshk echo park hammock godard venmo flannel tilde umami enamel pin trust fund single-origin coffee etsy. Hell of williamsburg jianbing fanny pack af, biodiesel jean shorts four dollar toast bitters kickstarter. DIY edison bulb keffiyeh raclette. Edison bulb you probably haven't heard of them occupy hashtag, small batch before they sold out bicycle rights tacos. IPhone selfies banh mi sartorial, typewriter seitan plaid. Fanny pack williamsburg gentrify plaid hoodie. Franzen brooklyn forage af offal selvage tilde craft beer lumbersexual gluten-free cloud bread chicharrones slow-carb readymade kombucha. Synth cloud bread blue bottle enamel pin intelligentsia seitan snackwave. Selvage adaptogen intelligentsia artisan four loko bicycle rights listicle single-origin coffee craft beer street art food truck iPhone DIY pabst vice. Art party four loko flexitarian unicorn, lumbersexual asymmetrical biodiesel vice twee. Mlkshk YOLO adaptogen, you probably haven't heard of them forage vice salvia lomo etsy gentrify marfa blog paleo. Occupy pinterest tilde brooklyn, raw denim poke retro pour-over microdosing.
-
-Skateboard keytar actually disrupt taiyaki, synth biodiesel. Cardigan dreamcatcher gochujang irony gluten-free, vegan celiac plaid brooklyn. Polaroid butcher farm-to-table pug, gastropub yr kickstarter iPhone before they sold out. Marfa cornhole migas hashtag flannel fashion axe deep v kogi. Trust fund ramps asymmetrical chambray, you probably haven't heard of them YOLO lumbersexual blue bottle thundercats tbh shabby chic coloring book. Kickstarter ugh try-hard four dollar toast master cleanse. Semiotics bespoke art party twee roof party cardigan. Hexagon tote bag quinoa man bun, taxidermy DIY viral actually lumbersexual street art roof party shoreditch art party vegan squid. Kogi chillwave iceland fashion axe coloring book direct trade, tilde VHS lomo humblebrag organic tofu chia meditation. Hella keytar shabby chic 90's taxidermy tacos marfa. Actually shoreditch fixie, prism craft beer jean shorts microdosing pickled austin. Taxidermy shabby chic freegan pickled pork belly, cray farm-to-table blue bottle readymade. 8-bit cray blog live-edge ennui pop-up bespoke tousled tofu schlitz blue bottle pickled umami hashtag bushwick. Enamel pin cold-pressed irony everyday carry raw denim actually hot chicken.
diff --git a/_posts/2017-09-12-the-best-organizer-software.markdown b/_posts/2017-09-12-the-best-organizer-software.markdown
deleted file mode 100644
index c2178cba9d..0000000000
--- a/_posts/2017-09-12-the-best-organizer-software.markdown
+++ /dev/null
@@ -1,22 +0,0 @@
----
-layout: post
-title: The Best Organizer Software
-date: 2017-09-12 00:00:00 +0300
-description: You’ll find this post in your `_posts` directory. Go ahead and edit it and re-build the site to see your changes. # Add post description (optional)
-img: software.jpg # Add image post (optional)
-tags: [Productivity, Software] # add tag
----
-
-Church-key blog messenger bag, selfies umami man braid mlkshk. Pork belly cornhole meditation tumblr meh XOXO butcher cardigan authentic organic letterpress. Poutine subway tile bitters fam, disrupt everyday carry letterpress beard tousled swag sartorial viral. Retro af 3 wolf moon heirloom, pork belly man bun DIY chillwave. Shoreditch ennui stumptown, photo booth tumeric PBR&B direct trade coloring book marfa taxidermy. Gentrify brunch typewriter woke freegan. Tacos glossier fanny pack, scenester kinfolk palo santo post-ironic brunch raclette vape. Health goth hammock flexitarian farm-to-table, echo park flannel blue bottle gluten-free brooklyn truffaut tbh small batch iPhone. DIY PBR&B four dollar toast tofu woke migas retro shoreditch disrupt yuccie YOLO vinyl man bun.
-
-### Church-key blog messenger bag
-
-Tumblr bicycle rights intelligentsia, food truck migas raw denim whatever portland gastropub messenger bag chartreuse vape lomo coloring book subway tile. Yr pabst meggings tattooed four dollar toast. Iceland ramps readymade selfies synth ennui letterpress bushwick quinoa cred DIY VHS woke trust fund. Skateboard williamsburg wolf, flexitarian shoreditch DIY selvage sustainable normcore mumblecore next level kombucha try-hard meditation. Gentrify plaid microdosing, master cleanse ugh crucifix pop-up. Wolf bushwick street art tumeric. Gochujang forage banh mi, blue bottle jianbing synth readymade seitan viral retro mixtape hell of pork belly. Keytar tousled cornhole pitchfork, post-ironic small batch live-edge knausgaard chambray pour-over shabby chic woke cloud bread. Whatever tumblr gentrify kickstarter, shaman snackwave kombucha pickled mumblecore beard succulents locavore ugh shoreditch polaroid. Wayfarers crucifix tattooed twee. Yr listicle crucifix fingerstache farm-to-table. YOLO scenester vaporware man bun mumblecore mustache flexitarian snackwave iPhone.
-
-Hella lo-fi banjo, disrupt tofu prism raclette. Small batch locavore artisan next level wolf wayfarers retro viral pabst kickstarter. Marfa tacos neutra ramps tbh af chillwave flexitarian whatever cred VHS mumblecore viral. Hell of retro vegan chambray tacos VHS four dollar toast tote bag. Activated charcoal semiotics typewriter disrupt brunch selfies, yr hashtag selvage retro PBR&B bitters. Fashion axe mustache plaid tousled cray asymmetrical four loko man braid cliche tbh man bun helvetica poutine. Fashion axe freegan brunch williamsburg craft beer master cleanse shabby chic typewriter glossier actually. Plaid tumblr hexagon neutra slow-carb mumblecore. Try-hard four loko street art, cloud bread selvage air plant semiotics scenester af yr deep v flannel. Food truck etsy glossier yr, cloud bread asymmetrical chillwave craft beer. Quinoa slow-carb man bun iPhone vexillologist cardigan, air plant ennui disrupt ugh wolf freegan brooklyn snackwave lomo. Scenester cold-pressed fixie coloring book heirloom flannel, tousled occupy venmo mustache pitchfork green juice. VHS neutra 8-bit roof party. Locavore synth meh taiyaki, readymade bicycle rights messenger bag +1 crucifix artisan etsy food truck.
-
-### Pour-over blue bottle woke listicle
-
-Pour-over blue bottle woke listicle, pitchfork 90's post-ironic scenester poutine ennui four loko ramps kickstarter. Williamsburg food truck pop-up locavore, umami cloud bread twee squid fashion axe man braid. Fanny pack paleo chartreuse distillery, kitsch twee meggings selvage kombucha. Keffiyeh actually prism listicle. Taxidermy authentic iPhone migas vegan copper mug. Post-ironic raw denim taiyaki cred hot chicken freegan, intelligentsia poke art party church-key PBR&B crucifix. Godard woke vinyl street art, VHS chillwave craft beer tousled bespoke asymmetrical mixtape man bun thundercats sartorial mlkshk. Meggings heirloom XOXO gentrify try-hard stumptown. Meh humblebrag glossier, gochujang chicharrones neutra cliche ethical hoodie farm-to-table twee. Messenger bag offal pug bespoke, put a bird on it tote bag literally.
-
-Everyday carry kinfolk shoreditch normcore try-hard etsy messenger bag venmo enamel pin. Try-hard fanny pack thundercats farm-to-table retro twee. Godard photo booth tofu 90's. Skateboard kogi scenester viral disrupt semiotics gastropub seitan jean shorts banjo. Humblebrag knausgaard waistcoat mixtape. Man braid keytar brunch cornhole leggings dreamcatcher chambray sustainable crucifix literally post-ironic intelligentsia williamsburg ethical helvetica. Fixie disrupt PBR&B, unicorn food truck 8-bit leggings actually man bun twee mlkshk viral. Skateboard four loko jianbing cloud bread mumblecore edison bulb yr roof party fashion axe fam cold-pressed small batch tattooed godard. Bushwick yuccie thundercats schlitz listicle skateboard quinoa. Gentrify hot chicken pop-up keytar master cleanse pork belly. Irony pitchfork la croix neutra freegan. Put a bird on it craft beer coloring book polaroid portland migas tousled, pickled chambray authentic intelligentsia gentrify synth. Letterpress tumblr wolf normcore selvage. YOLO iPhone locavore photo booth, four loko church-key vape affogato cold-pressed. Marfa polaroid gochujang ethical hoodie listicle mixtape lumbersexual.
diff --git a/_posts/2020-07-08-vba-macros.md b/_posts/2020-07-08-vba-macros.md
new file mode 100644
index 0000000000..b2941cfd72
--- /dev/null
+++ b/_posts/2020-07-08-vba-macros.md
@@ -0,0 +1,44 @@
+---
+layout: post
+title: "VBA EXCEL macros - automation"
+date: 2023-07-20 13:32:20 +0300
+description: Some examples of automation done with VBA. The Excel macros are showcased here and can also be downloaded. # Add post description (optional)
+#url:
+img: vbaexcel.jpg # Add image post (optional)
+fig-caption: Some examples of automation done with VBA. The Excel macros are showcased here and can also be downloaded.
+tags: [EXCEL, VBA, automation, coding]
+---
+
+Here are some examples of automation done with VBA. The Excel macros are showcased here and can also be downloaded.
+
+Please wait a few seconds for the macros to load!
+
+I have worked with VBA since I joined Evalueserve in 2017, and below are some of the macros I have built. I also built VBA macros while working at Jarislowsky (although I wasn't allowed to take that work with me). In fact, at Jarislowsky I fully automated many of my processes (using IT tools and SQL as well).
+
+
+
+### 1. Macro 'Americas' - Pricing Macro
+
+I developed this macro while working at Evalueserve. It was linked to a Deutsche Bank server; to retrieve the data, the macro connected to the server and ran SQL queries.
+
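+As a rough illustration of that connect-and-query pattern, here is a hedged sketch in Python rather than the original VBA; the driver, server, database, table and query below are hypothetical stand-ins, not the actual bank setup:
+
+``` python
+import pyodbc  # assumes an ODBC driver for SQL Server is installed
+
+# Hypothetical connection details -- the real macro pointed at the bank's
+# internal server and credentials, which are not reproduced here
+conn = pyodbc.connect(
+    "DRIVER={ODBC Driver 17 for SQL Server};"
+    "SERVER=example-pricing-server;DATABASE=PricingDB;Trusted_Connection=yes;"
+)
+cursor = conn.cursor()
+# Hypothetical pricing query, standing in for the macro's SQL
+cursor.execute(
+    "SELECT InstrumentID, Price, AsOfDate FROM Prices WHERE Region = ?",
+    "Americas",
+)
+rows = cursor.fetchall()
+conn.close()
+```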
+
+
+
+
+### 2. Budget tracker
+
+
+
+
+
+### 3. International Securities Identification Number
+
+
+
+
+
+
+
+
+### 4. Portfolio management macro
+
\ No newline at end of file
diff --git a/_posts/2021-01-08-SQL-exercises.md b/_posts/2021-01-08-SQL-exercises.md
new file mode 100644
index 0000000000..69c15d36c4
--- /dev/null
+++ b/_posts/2021-01-08-SQL-exercises.md
@@ -0,0 +1,212 @@
+---
+layout: post
+title: "SQL Exercises"
+date: 2023-07-20 13:32:20 +0300
+description: Sample SQL exercises done during one of my Business Intelligence classes. These are complex queries! # Add post description (optional)
+#url:
+img: sql.png # Add image post (optional)
+fig-caption: Sample SQL exercises done during one of my Business Intelligence classes. These are complex queries!
+tags: [sql]
+---
+
+# SQL exercises
+
+Below are some exercises I worked on for my 'Business Intelligence Techniques' class. Every exercise took some time to figure out, but I decided they would make a good work sample to showcase how much I can accomplish. During this class I also learned about relational database concepts and SQL concepts. The class was in French, so a few labels in the queries are (for now) still in French; I will work on their translations in the days to come (starting now, the 20th of July 2023).
+
+``` sql
+/*
+ * TECH 60701 -- Technologies de l'intelligence d'affaires
+ * HEC Montréal
+*/
+ use AdventureWorks2019
+ go
+
+/*
+ Question #1 :
+ AdventureWorks would like to implement former General Electric Chairman and CEO Jack Welch's "vitality model",
+ which has been described as a "20-70-10" system: the top 20% of employees are the most productive, 70%
+ (the "vital 70%") work adequately, and the remaining 10% are non-producers who must be let go.
+
+ Using a ranking clause and a subquery, you need to write a query to identify the top 20% of salespeople
+ (to congratulate and encourage them!) as well as the bottom 10% (to show them the door)!
+ The salespeople belonging to the remaining 70% should not appear in the report.
+
+ (By salespeople, we're referring to sales clerks, regardless of job title.)
+
+ Since AdventureWorks sells mainly bicycles, spring (March to May inclusive) is crucial to its financial results.
+ Therefore, the analysis should only take into account the subtotal of sales that salespeople achieved during this
+ period, whatever the year. For each result, the following should be displayed:
+
+ - Salesperson ID
+ - Salesperson's National ID Number
+ - Salesperson's first name
+ - Salesperson's last name
+ - Salesperson's subtotal of sales for the spring period (formatted in dollars, i.e. $xxx.xx)
+ - Salesperson's percentage rank (formatted as a percentage with two digits of precision)
+ - The decile to which the salesperson belongs
+ - A personalized message: 1st decile 'Excellent performance!'; 2nd decile 'Keep up the good work!';
+   10th decile 'Are you looking for a job elsewhere?'
+*/
+
+
+
+select
+ *,
+    Case
+     when Decile = 1 then 'Excellent performance!'
+     when Decile = 2 then 'Keep up the good work!'
+     when Decile = 10 then 'Are you looking for a job elsewhere?'
+     else 'Whatever' -- unreachable: the outer WHERE keeps only deciles 1, 2 and 10
+    end as 'Status'
+ from (
+ select
+ soh.SalesPersonID,
+ e.NationalIDNumber,
+ p.FirstName,
+ p.LastName,
+ FORMAT(SUM(soh.SubTotal), 'c', 'en-us') as 'Somme sous-total Ventes',
+      /* The salesperson's percentage rank (formatted as a percentage with two digits of precision) */
+ FORMAT(ROUND(percent_rank() over(order by sum(soh.SubTotal) desc), 2), 'p') as 'Le rang en %',
+ NTILE(10) over(order by SUM(soh.SubTotal) desc) as 'Decile'
+ from
+ Sales.SalesOrderHeader soh
+ inner join Sales.SalesPerson sp1 on soh.SalesPersonID = sp1.BusinessEntityID
+ inner join Person.Person p on sp1.BusinessEntityID = p.BusinessEntityID
+ inner join HumanResources.Employee e on p.BusinessEntityID= e.BusinessEntityID
+
+ where Month(OrderDate) in (3,4,5)
+ group by soh.SalesPersonID, e.NationalIDNumber, p.FirstName, p.LastName
+ ) as Table2
+ where Decile in (1,2,10)
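+
+/*
+ Note: NTILE(10) assigns each salesperson to a decile by spring sales, while
+ PERCENT_RANK() reports a relative rank in [0, 1]; the outer WHERE keeps only
+ deciles 1, 2 and 10, so the middle ~70% never appear in the report.
+*/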
+
+
+
+/*
+ Question #2 :
+ AdventureWorks would like to explore its customers' purchases of accessories (non-manufactured products). We are
+ particularly interested in accessories that were ordered by stores located in Canada in the same orders as their
+ bicycle purchases (products manufactured by AdventureWorks).
+ Therefore, data should be displayed only for sales made to stores (not individual customers) that purchased bicycles.
+
+ Using a CTE, you should display a list containing information grouped by product identifier, product name and
+ product number.
+
+ Your report should contain only these columns, as follows:
+
+
+
+ ProductID |Name |ProductNumber |OrderCount |Rang
+ 715 |Long-Sleeve Logo Jersey, L |LJ-0192-L |238 |1
+ 712 |AWC Logo Cap |CA-1098 |237 |2
+ 708 |Sport-100 Helmet, Black |HL-U509 |190 |3
+ ... |... |... |... |...
+
+ This indicates, for example, that of all the orders placed by stores in which manufactured products were purchased, 238
+ orders also included the purchase of product 715 (Long-Sleeve Logo Jersey, L), 237 orders included product 712 (AWC Logo Cap),
+ etc. The ranking used does not skip values.
+
+ Sort by "OrderCount", in descending order.
+*/
+ --7385
+with CTEQ2(ProductID, Name, ProductNumber,SalesOrderID, SalesOrderDetailID) as
+(
+select
+pt.ProductID, pt.Name, pt.ProductNumber,
+soh.SalesOrderID, sod.SalesOrderDetailID
+
+ from Sales.SalesOrderHeader soh
+ inner join Sales.Customer c on c.CustomerID = soh.CustomerID
+ inner join Person.BusinessEntityAddress bea on c.StoreID = bea.BusinessEntityID
+ inner join person.Address a on a.AddressID = bea.AddressID
+ inner join Sales.SalesOrderDetail sod on soh.SalesOrderID = sod.SalesOrderID
+ inner join Production.Product pt on sod.ProductID = pt.ProductID
+ inner join Person.StateProvince sp on sp.StateProvinceID =a.StateProvinceID
+ where sp.CountryRegionCode = 'CA' AND pt.MakeFlag =1 --AND soh.SalesOrderID='55280'
+ )
+ select
+ p.ProductID,
+ p.Name,
+ p.ProductNumber,
+ COUNT( distinct(CTEQ2.SalesOrderID)) as OrderCount,
+ dense_rank()over(order by COUNT(distinct(CTEQ2.SalesOrderID)) desc) as 'Rang'
+ from CTEQ2
+ inner join Sales.SalesOrderDetail sod on sod.SalesOrderID = CTEQ2.SalesOrderID and sod.SalesOrderDetailID <> CTEQ2.SalesOrderDetailID
+ inner join Production.Product p on p.ProductID = sod.ProductID
+ where p.MakeFlag =0
+ group by p.ProductID, p.[Name], p.ProductNumber;
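+
+/*
+ Note: the CTE keeps only the order lines of manufactured products (MakeFlag = 1)
+ bought by Canadian stores; the outer query then pulls the *other* lines of those
+ same orders (sod.SalesOrderDetailID <> CTEQ2.SalesOrderDetailID), keeps purchased
+ accessories (MakeFlag = 0), and counts distinct orders per accessory.
+*/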
+
+
+
+
+/*
+ Question #3 a) :
+ You are asked to provide a query displaying the following details of active suppliers with preferred status from
+ whom AdventureWorks has placed fewer than 30 orders. Show:
+ - Supplier ID
+ - Order date
+ - A sequence number assigned to each order placed with the supplier, starting with the most recent order
+ - The subtotal of each order (formatted in dollars, i.e. $xxx.xx)
+*/
+
+
+
+select
+    poh.VendorID,
+    poh.OrderDate,
+    row_number() over(partition by poh.VendorID order by OrderDate desc) as 'No_sequence',
+    FORMAT(poh.SubTotal, 'C') as 'SubTotal'
+ from Purchasing.PurchaseOrderHeader poh
+ inner join Purchasing.Vendor v on v.BusinessEntityID = poh.VendorID
+ where v.ActiveFlag = 1 AND v.PreferredVendorStatus = 1 AND
+ poh.VendorID in (select VendorID from Purchasing.PurchaseOrderHeader group by VendorID having count(PurchaseOrderID) <= 30);
+
+
+
+
+
+/*
+ Question #3 b) :
+ AdventureWorks would like to know which of these preferred suppliers (with whom AdventureWorks has placed fewer
+ than 30 orders) tend to increase their prices. The company would like to use this information to remove their
+ "preferred supplier" status. The assumption here is that orders from a supplier remain stable over time and are
+ therefore always for similar products/quantities.
+
+ Using a CTE based on the query in part a), build a query that will display the list of suppliers for which
+ the average amount (using the subtotal) of their three most recent orders is greater than the average amount
+ they have charged AdventureWorks to date.
+
+ We'd like to display :
+ - Supplier ID
+ - The average amount of all orders placed with the supplier
+ - The average amount of the three most recent orders placed with the supplier
+ - The difference between the average amount of the three most recent orders placed with the supplier and the
+   average amount of all orders placed with the supplier
+
+ Your report should contain only these four columns, sorted by the increase in acquisition cost, so that the
+ largest increase is at the top of the list. All amounts must be formatted in dollars, i.e. $xxx.xx.
+*/
+
+
+
+
+with CTEQ3b(VendorID, OrderDate, RowNum, SubTotal) as
+(select poh.VendorID
+ , poh.OrderDate,
+ row_number()over(partition by poh.VendorID order by OrderDate desc) ,
+ poh.SubTotal
+ from Purchasing.PurchaseOrderHeader poh
+ inner join Purchasing.Vendor v on v.BusinessEntityID = poh.VendorID
+ where v.ActiveFlag =1 AND v.PreferredVendorStatus=1 AND
+ poh.VendorID in (select VendorID from Purchasing.PurchaseOrderHeader group by VendorID having count(PurchaseOrderID)<=30)
+
+)
+select
+c.VendorID,
+
+(select format(avg(Subtotal),'C') from Purchasing.PurchaseOrderHeader poh where c.VendorID=poh.VendorID) as 'total',
+format(avg(c.SubTotal),'C') as 'total 3' ,
+Format(avg(SubTotal) - (select avg(Subtotal) from Purchasing.PurchaseOrderHeader poh where c.VendorID=poh.VendorID),'C') as 'difference'
+from CTEQ3b c
+where c.RowNum<=3
+group by c.VendorID
+having avg(SubTotal) - (select avg(Subtotal) from Purchasing.PurchaseOrderHeader poh where c.VendorID=poh.VendorID) >0
+order by avg(SubTotal) - (select avg(Subtotal) from Purchasing.PurchaseOrderHeader poh where c.VendorID=poh.VendorID) desc
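+
+/*
+ Note: the correlated subquery recomputes each vendor's all-orders average, while
+ the outer query (restricted to RowNum <= 3) averages only the three most recent
+ orders; a positive difference therefore flags vendors whose recent prices rose.
+*/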
+
+```
\ No newline at end of file
diff --git a/_posts/2021-05-05-Twitter-API-V2.md b/_posts/2021-05-05-Twitter-API-V2.md
new file mode 100644
index 0000000000..052a247992
--- /dev/null
+++ b/_posts/2021-05-05-Twitter-API-V2.md
@@ -0,0 +1,585 @@
+---
+layout: post
+title: "Twitter API V2"
+date: 2023-07-20 13:32:20 +0300
+description: Python code to retrieve tweets with the V2 API. I modified this code so I could retrieve exactly what I needed, and used it to collect the data for my Master's Thesis.
+ # Add post description (optional)
+#url:
+img: twitter-api.jpg # Add image post (optional)
+fig-caption:
+tags: [twitter, coding, python]
+---
+
+This application was made with Twitter API version 2. It is built with Python.
+
+This is Python code I modified so I could get more Twitter information (adapted from a Towards Data Science tutorial). I added an algorithm to retrieve tweets in fixed time windows for each day, and a piece of code to retrieve 3 more files of Twitter information (user info, place info, retweet info) in addition to the main file. Everything is saved to CSV files.
+
+This produces 4 CSV files containing Twitter information.
+
+The file 'AcademicMain' contains the fields: 'author id', 'created_at', 'place_id', 'referenced_id', 'Retweet', 'id', 'conversation_id', 'lang', 'source', 'tweet', 'username_mentioned', 'username_mentioned_id', 'urls_expanded', 'urls', 'tag'.
+
+The file 'AcademicUsers' contains the fields: 'author id', 'username', 'place_id', 'description', 'name', 'followers_count', 'following_count', 'verified', 'profile_image_url'.
+
+The file 'AcademicRetweetInfo' contains the fields: 'conversation_id', 'referenced_id', 'place_id', 'text2', 'username_mentioned2', 'username_mentioned_id2', 'url2', 'urls_expanded2', 'tag2'.
+
+The file 'AcademicPlaces' contains the fields: 'place_id', 'name_country', 'full_name_place', 'country', 'country_code', 'place_type'.
+
+After obtaining the files: merge 'AcademicMain' with 'AcademicUsers' on 'author id'; merge 'AcademicRetweetInfo' with 'AcademicMain' on 'referenced_id'; and merge the 'place_id' in 'AcademicMain' with the 'place_id' in 'AcademicPlaces'.
+
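+For example, here is a minimal pandas sketch of that merge (a sketch only: the file names below are placeholders for the timestamped CSVs the script actually writes). The full collection script follows after this sketch.
+
+``` python
+import pandas as pd
+
+# Placeholder file names -- the script writes timestamped CSVs into /data
+main = pd.read_csv("AcademicMain.csv")
+users = pd.read_csv("AcademicUsers.csv")
+retweets = pd.read_csv("AcademicRetweetInfo.csv")
+places = pd.read_csv("AcademicPlaces.csv")
+
+# The script prefixes some IDs with an apostrophe (for Excel); strip it first
+for df, cols in [(main, ["author id", "referenced_id"]), (users, ["author id"]),
+                 (retweets, ["referenced_id"]), (places, ["place_id"])]:
+    for col in cols:
+        df[col] = df[col].astype(str).str.lstrip("'")
+
+# 'author id' links each tweet to its author's profile
+merged = main.merge(users, on="author id", how="left", suffixes=("", "_user"))
+# 'referenced_id' links a retweet/reply back to the referenced tweet's details
+merged = merged.merge(retweets, on="referenced_id", how="left", suffixes=("", "_ref"))
+# 'place_id' links geotagged tweets to the place information
+merged = merged.merge(places, on="place_id", how="left", suffixes=("", "_place"))
+```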
+
+``` python
+
+
+## For sending GET requests from the API
+import requests
+# For saving access tokens and for file management when creating and adding to the dataset
+import os
+# For dealing with json responses we receive from the API
+import json
+# For displaying the data after
+#import pandas as pd
+# For saving the response data in CSV format
+import csv
+# For parsing the dates received from twitter in readable formats
+import datetime
+import dateutil.parser
+import unicodedata
+#To add wait time between requests
+import time
+from pathlib import Path
+
+
+
+# `datetime` is imported as a module above; the code below relies on
+# datetime.datetime(...) and datetime.datetime.now()
+import pytz  # only used by the commented-out timezone conversion below
+from datetime import date, timedelta
+
+# Placeholder: set your own app's bearer token here (never commit a real token)
+os.environ['TOKEN'] = '<YOUR_BEARER_TOKEN>'
+
+def listdates(a, b):
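+    # Build two aligned lists of ISO-8601 timestamps: one window per day from
+    # a to b. Each window starts at midnight and, with n=12 below, spans the
+    # first 12 hours of the day -- one search request is made per window.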
+ sdate = a # start date
+ edate = b # end date
+ delta = edate - sdate # as timedelta
+ begin_list =[]
+ end_list =[]
+ for i in range(delta.days + 1):
+ day = sdate + timedelta(days=i)
+ year = day.strftime("%Y")
+
+ month = day.strftime("%m")
+
+ day = day.strftime("%d")
+
+
+ begin_time = datetime.datetime(int(year), int(month), int(day), 0)
+ #local_dt = local.localize(begin_time, is_dst=None)
+ #utc_dt = local_dt.astimezone(pytz.utc)
+
+ m= begin_time.isoformat("T") + ".000Z"
+ begin_list.append(m)
+
+ n=12
+ m=0
+ s=0
+ # Add 2 hours to datetime object
+ final_time= begin_time+ timedelta(hours=n, minutes=m, seconds=s)
+ final_t = final_time.isoformat("T") + ".000Z"
+ end_list.append(final_t)
+
+
+ return(begin_list, end_list)
+#time_change = datetime.timedelta(hours=10)
+#new_time = date_and_time + time_change
+
+
+
+
+def auth():
+ return os.getenv('TOKEN')
+
+def create_headers(bearer_token):
+ headers = {"Authorization": "Bearer {}".format(bearer_token)}
+ return headers
+
+def create_url(keyword, start_date, end_date, max_results):
+ search_url = "https://api.twitter.com/2/tweets/search/all" # Change to the endpoint you want to collect data from
+
+ # change params based on the endpoint you are using
+ query_params = {'query': keyword,
+ 'start_time': start_date,
+ 'end_time': end_date,
+ 'max_results': max_results,
+ 'expansions': 'author_id,in_reply_to_user_id,geo.place_id,referenced_tweets.id,attachments.media_keys',
+ 'tweet.fields': 'id,text,author_id,in_reply_to_user_id,geo,conversation_id,created_at,public_metrics,lang,entities,reply_settings,source',
+ 'user.fields': 'id,name,username,created_at,description,public_metrics,verified',
+ 'place.fields': 'full_name,id,country,country_code,geo,name,place_type',
+ 'next_token': {}}
+ return (search_url, query_params)
+
+
+def connect_to_endpoint(url, headers, params, next_token = None):
+ params['next_token'] = next_token #params object received from create_url function
+ response = requests.request("GET", url, headers = headers, params = params)
+ print("Endpoint Response Code: " + str(response.status_code))
+ if response.status_code != 200:
+ raise Exception(response.status_code, response.text)
+ return response.json()
+
+
+def write_json(new_data, filenamejson):
+    # Append the raw JSON response to an archive file (useful for debugging
+    # and for re-parsing later without re-querying the API)
+    with open(filenamejson, 'a') as jsonfile:
+        json.dump(new_data, jsonfile, indent=4, sort_keys=True)
+
+
+def append_to_csv(json_response, fileName):
+ # A counter variable
+ counter = 0
+
+ # Open OR create the target CSV file
+ csvFile = open(fileName, "a", newline="", encoding='utf-8')
+ csvWriter = csv.writer(csvFile)
+
+ #
+
+ # Loop through each tweet
+ for tweet in json_response['data']:
+
+ # We will create a variable for each since some of the keys might not exist for some tweets
+ # So we will account for that
+
+ # 1. Author ID
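+        # (the leading apostrophe keeps spreadsheet apps from mangling long numeric IDs)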
+ author_id = str("'" + tweet['author_id'])
+
+ # 2. Time created
+ created_at = dateutil.parser.parse(tweet['created_at'])
+ ###'place.fields': 'full_name,id,country,country_code,geo,name,place_type',
+ # 3. Geolocation
+ if ('geo' in tweet):
+ if('place_id' in tweet['geo']):
+ place_id = tweet['geo']['place_id']
+ else:
+ place_id = " "
+
+ else:
+ place_id = " "
+
+
+ if('referenced_tweets' in tweet):
+ referenced_id= str("'" + tweet['referenced_tweets'][0]['id'])
+
+
+ Retweet=str("'" + tweet['referenced_tweets'][0]['type'])
+
+ else:
+ referenced_id=' '
+ Retweet=' '
+
+
+
+ # 4. Tweet ID
+ tweet_id = str("'" + tweet['id'])
+
+
+
+ conversation_id = str("'" + tweet['conversation_id'])
+
+
+
+
+ # 5. Language
+ lang = tweet['lang']
+
+ # 6. Tweet metrics
+
+ # 7. source
+ source = tweet['source']
+
+ # 8. Tweet text
+ text = tweet['text']
+
+
+
+
+ if('entities' in tweet):
+
+ if('mentions' in tweet['entities']):
+
+
+
+ d= len(tweet['entities']['mentions'])
+ user1 =[]
+ user2=[]
+ for i in range(d):
+ user1.append(tweet['entities']['mentions'][i]['username'])
+ user2.append(tweet['entities']['mentions'][i]['id'])
+
+ username_mentioned = user1
+ #print(username_mentioned)
+ username_mentioned_id = user2
+ # print(username_mentioned_id)
+
+
+ else:
+ username_mentioned=' '
+ username_mentioned_id=' '
+
+ if('urls' in tweet['entities']):
+ d= len(tweet['entities']['urls'])
+ url1 =[]
+ url2 =[]
+ for i in range(d):
+ url1.append(tweet['entities']['urls'][i]['expanded_url'])
+ url2.append(tweet['entities']['urls'][i]['url'])
+
+ urls_expanded= url1
+ urls = url2
+ else:
+ urls_expanded=' '
+ urls=' '
+ if('hashtags' in tweet['entities']):
+ tag1 =[]
+ d= len(tweet['entities']['hashtags'])
+ for i in range(d):
+ tag1.append(tweet['entities']['hashtags'][i]['tag'])
+ tag=tag1
+
+ else:
+ tag=' '
+ else:
+ username_mentioned=' '
+ username_mentioned_id=' '
+ urls_expanded=' '
+ urls=' '
+ tag=' '
+
+
+ res = [ author_id, created_at, place_id, referenced_id, Retweet, tweet_id, conversation_id, lang, source, text, username_mentioned,
+ username_mentioned_id, urls_expanded, urls, tag]
+
+
+ csvWriter.writerow(res)
+ counter += 1
+
+
+#When done, close the CSV file
+ csvFile.close()
+
+# Print the number of tweets for this iteration
+ print("# of Tweets added from this response: ", counter)
+
+
+
+
+
+def append_to_csvUsers(json_response, fileName):
+ # A counter variable
+ counter = 0
+
+ # Open OR create the target CSV file
+ csvFile = open(fileName, "a", newline="", encoding='utf-8')
+ csvWriter = csv.writer(csvFile)
+
+ # Loop through each tweet
+ if('users' in json_response['includes']):
+ for tweet in json_response['includes']['users']:
+ #print(tweet)
+
+ author_id = str("'" + tweet['id'])
+ username = tweet['username']
+
+ if ('geo' in tweet):
+ if('place_id' in tweet['geo']):
+ place_id = tweet['geo']['place_id']
+ else:
+ place_id = " "
+ else:
+ place_id = " "
+ #print(username)
+ name=tweet['name']
+ description=tweet['description']
+ followers_count=tweet['public_metrics']['followers_count']
+ following_count=tweet['public_metrics']['following_count']
+ verified= tweet['verified']
+
+ if('profile_image_url' in tweet):
+ profile_image_url=tweet['profile_image_url']
+ else:
+ profile_image_url= ' '
+
+
+
+
+
+
+ res = [author_id, username, place_id, description, name, followers_count, following_count, verified, profile_image_url]
+
+
+ csvWriter.writerow(res)
+ counter += 1
+
+
+#When done, close the CSV file
+ csvFile.close()
+
+# Print the number of tweets for this iteration
+ print("# of Tweets added from this response: ", counter)
+
+
+
+
+def append_to_csvExtended(json_response, fileName):
+
+# A counter variable
+ counter = 0
+
+ # Open OR create the target CSV file
+ csvFile = open(fileName, "a", newline="", encoding='utf-8')
+ csvWriter = csv.writer(csvFile)
+
+
+ if('tweets' in json_response['includes']):
+
+ #print(json_response['includes']['tweets'])
+
+ for tweet in json_response['includes']["tweets"]:
+ #print(tweet)
+
+ conversation_id=str("'" +tweet['conversation_id'])
+ referenced_id= str("'" +tweet['id'])
+
+ if ('geo' in tweet):
+ if('place_id' in tweet['geo']):
+ place_id = tweet['geo']['place_id']
+ #print(place_id)
+ else:
+ place_id = " "
+
+ else:
+ place_id = " "
+
+
+ text2=tweet['text']
+
+ if('entities' in tweet):
+
+ if('mentions' in tweet['entities']):
+
+ #print((tweet['entities']['mentions']))
+
+ d= len(tweet['entities']['mentions'])
+ user1 =[]
+ user2=[]
+ for j in range(d):
+ user1.append(tweet['entities']['mentions'][j]['username'])
+ user2.append(tweet['entities']['mentions'][j]['id'])
+
+ username_mentioned2 = user1
+ username_mentioned_id2 = user2
+
+
+ else:
+ username_mentioned2=' '
+ username_mentioned_id2=' '
+
+ if('urls' in tweet['entities']):
+ d= len(tweet['entities']['urls'])
+ url1 =[]
+ url2 =[]
+ for j in range(d):
+ url1.append(tweet['entities']['urls'][j]['expanded_url'])
+ url2.append(tweet['entities']['urls'][j]['url'])
+
+ urls_expanded2= url1
+ urls2 = url2
+ #print(urls2)
+ else:
+ urls_expanded2=' '
+ urls2=' '
+ if('hashtags' in tweet['entities']):
+ tag1 =[]
+ d= len(tweet['entities']['hashtags'])
+ for j in range(d):
+ tag1.append(tweet['entities']['hashtags'][j]['tag'])
+ tag2=tag1
+
+ else:
+ tag2=' '
+
+ else:
+ username_mentioned2=' '
+ username_mentioned_id2=' '
+ urls_expanded2=' '
+ urls2=' '
+ tag2=' '
+
+
+
+ res = [conversation_id, referenced_id, place_id, text2, username_mentioned2, username_mentioned_id2, urls2, urls_expanded2, tag2 ]
+
+
+ csvWriter.writerow(res)
+ counter += 1
+
+#When done, close the CSV file
+ csvFile.close()
+
+# Print the number of tweets for this iteration
+ print("# of Tweets added from this response: ", counter)
+
+
+def append_to_csvPlaces(json_response, fileName):
+ # A counter variable
+ counter = 0
+
+ # Open OR create the target CSV file
+ csvFile = open(fileName, "a", newline="", encoding='utf-8')
+ csvWriter = csv.writer(csvFile)
+
+ # Loop through each tweet
+
+
+ #print(json_response['includes']['places'])
+ # n=len(json_response['includes']['places'])
+ if('places' in json_response['includes']):
+ for tweet in json_response['includes']['places']:
+
+ # print(tweet)
+ place_id = str("'" + tweet['id'])
+ # print(place_id)
+ name_country=tweet['name']
+ full_name_place=tweet['full_name']
+ country=(tweet['country'])
+ country_code=tweet['country_code']
+ place_type=tweet['place_type']
+
+
+
+            res = [place_id, name_country, full_name_place, country, country_code, place_type]
+
+
+ csvWriter.writerow(res)
+ counter += 1
+
+
+#When done, close the CSV file
+ csvFile.close()
+
+# Print the number of tweets for this iteration
+ print("# of Tweets added from this response: ", counter)
+
+#Inputs for tweets
+bearer_token = auth()
+headers = create_headers(bearer_token)
+
+keyword = 'onlyfans -promotion -promote lang:en'
+# '"new comer" "escort" "call girls" OR #callgirl lang:en'
+
+answer = listdates(date(2021, 1, 16), datetime.datetime.now().date())
+
+start_list = answer[0]
+end_list = answer[1]
+max_results = 500
+
+
+#Total number of tweets we collected from the loop
+total_tweets = 0
+
+# Create file
+timestr = time.strftime("%Y%m%d-%H%M%S")
+filename1 = Path("/data") / ('AcademicMain' + timestr + ".csv")
+csvFile = open(filename1, "a", newline="", encoding='utf-8')
+csvWriter = csv.writer(csvFile)
+
+filename2 = Path("/data") / ('AcademicUsers' + timestr + ".csv")
+csvFile2 = open(filename2, "a", newline="", encoding='utf-8')
+csvWriter2 = csv.writer(csvFile2)
+
+
+filename3 = Path("/data") / ('AcademicRetweetInfo' + timestr + ".csv")
+csvFile3 = open(filename3, "a", newline="", encoding='utf-8')
+csvWriter3 = csv.writer(csvFile3)
+
+
+filename4 = Path("/data") / ('AcademicPlaces' + timestr + ".csv")
+csvFile4 = open(filename4, "a", newline="", encoding='utf-8')
+csvWriter4 = csv.writer(csvFile4)
+
+
+#Create headers for the data you want to save, in this example, we only want save these columns in our dataset
+csvWriter.writerow(['author id', 'created_at', 'place_id', 'referenced_id', 'Retweet', 'id', 'conversation_id' ,'lang',
+ 'source', 'tweet', 'username_mentioned', 'username_mentioned_id', 'urls_expanded', 'urls', 'tag'])
+
+csvWriter2.writerow(['author id','username', 'place_id', 'description', 'name', 'followers_count', 'following_count', 'verified', 'profile_image_url'])
+
+
+csvWriter3.writerow(['conversation_id', 'referenced_id','place_id', 'text2','username_mentioned2','username_mentioned_id2', 'url2', 'urls_expanded2', 'tag2'])
+
+csvWriter4.writerow(['place_id', 'name_country', 'full_name_place', 'country', 'country_code', 'place_type'])
+
+csvFile.close()
+csvFile2.close()
+csvFile3.close()
+csvFile4.close()
+
+for i in range(0,len(start_list)):
+
+ # Inputs
+ count = 0 # Counting tweets per time period
+ max_count = 1500 # Max tweets per time period
+ flag = True
+ next_token = None
+
+ # Check if flag is true
+ while flag:
+ # Check if max_count reached
+ if count >= max_count:
+ break
+ print("-------------------")
+ print("Token: ", next_token)
+ url = create_url(keyword, start_list[i],end_list[i], max_results)
+ json_response = connect_to_endpoint(url[0], headers, url[1], next_token)
+ result_count = json_response['meta']['result_count']
+ write_json(json_response, "/data/JSON" + timestr + ".json")
+
+ if 'next_token' in json_response['meta']:
+ # Save the token to use for next call
+ next_token = json_response['meta']['next_token']
+ print("Next Token: ", next_token)
+ if result_count is not None and result_count > 0 and next_token is not None:
+ print("Start Date: ", start_list[i])
+ append_to_csv(json_response, filename1)
+ append_to_csvUsers(json_response, filename2)
+ append_to_csvExtended(json_response, filename3)
+ append_to_csvPlaces(json_response, filename4)
+ count += result_count
+ total_tweets += result_count
+ print("Total # of Tweets added: ", total_tweets)
+ print("-------------------")
+ time.sleep(5)
+ # If no next token exists
+ else:
+ if result_count is not None and result_count > 0:
+ print("-------------------")
+ print("Start Date: ", start_list[i])
+ append_to_csv(json_response, filename1)
+ append_to_csvUsers(json_response, filename2)
+ append_to_csvExtended(json_response, filename3)
+ append_to_csvPlaces(json_response, filename4)
+ count += result_count
+ total_tweets += result_count
+ print("Total # of Tweets added: ", total_tweets)
+ print("-------------------")
+ time.sleep(5)
+
+ #Since this is the final request, turn flag to false to move to the next time period.
+ flag = False
+ next_token = None
+ time.sleep(5)
+print("Total number of results: ", total_tweets)
+
+
+```
\ No newline at end of file
diff --git a/_posts/2023-03-11-Thesis-Masters.md b/_posts/2023-03-11-Thesis-Masters.md
new file mode 100644
index 0000000000..cf8712f885
--- /dev/null
+++ b/_posts/2023-03-11-Thesis-Masters.md
@@ -0,0 +1,22 @@
+---
+layout: post
+title: Detecting Coordinated Activities through OnlyFans Tweets using Machine learning
+date: 2023-07-20 13:32:20 +0300
+description: This thesis was conducted during my studies at HEC Montreal. Here I showcase my work in a PDF format that you can download. # Add post description (optional)
+#url:
+img: thesis.png # Add image post (optional)
+fig-caption:
+tags: [machine learning, ai, statistics]
+---
+This Thesis was conducted during my studies at HEC Montreal. Here I showcase my work in a PDF format that you can download.
+
+I decided to pursue a master's degree with thesis because I was very curious about machine learning research and social issues. I believed that the only way to satisfy my curiosity was to do a thesis, and I think I learned more technical skills doing my thesis than by taking classes and doing a short project. Here I worked with big data, Python, graphs, Mila servers and unsupervised learning techniques. I had the opportunity to work on a paper (you can look at the publication here) alongside a PhD student who specializes in graphs, and we had the honor of presenting our work at WebSci 2023.
+
+
+## Thesis Abstract
+In this thesis by articles, we present a research paper that we submitted to the WebSci'23 conference and that is now under review. In addition to the article itself, in the thesis we provide further detail regarding the motivation, background, literature review and research. The aim of this thesis is to provide a method that can facilitate the work of individuals combating online human trafficking. The majority of trafficking victims report being advertised online, which explains why online sex trafficking has been on the rise in the past few years. Meanwhile, the use of OnlyFans as a platform for adult content has increased exponentially in the past three years, and Twitter has been its main advertising tool. Since we know that traffickers usually work within a network and control multiple victims, we suspect that there may be networks of traffickers promoting multiple OnlyFans accounts belonging to their victims. Based on these observations, we decided to conduct the first study looking at organized activities on Twitter through OnlyFans advertisements. Preliminary analysis of this space shows that most tweets related to OnlyFans contain generic text, making text-based methods less reliable. Instead, focusing on what ties the authors of these tweets together, we propose a novel method for uncovering coordinated networks of users based on their behaviour. Our method, called Multi-Level Clustering (MLC), combines two levels of clustering. In the first level, we detect communities based on username Mentions and shared URLs, while the second level is done through two different approaches: (i) the Partial Intersections (PI) of URL and Mention communities; (ii) Joint Clustering (JT), applying a dense subgraph detection algorithm. We additionally show that our JT approach, applied to synthetically generated data (with injected ground truth), performs better than competitive baselines. Furthermore, we apply MLC to real-world data of tweets pertaining to OnlyFans, analyse the detected groups, and show that our Partial Intersections approach provides good-quality clusters (high entropy of OnlyFans accounts). Our paper and our thesis end with a discussion where we show carefully chosen examples of organized clusters and provide multiple interesting points that support our method.
+
+
+Below is my full thesis.
+
+
diff --git a/_posts/2023-07-13-Data-analysis-Kaggles.md b/_posts/2023-07-13-Data-analysis-Kaggles.md
new file mode 100644
index 0000000000..15b9464e94
--- /dev/null
+++ b/_posts/2023-07-13-Data-analysis-Kaggles.md
@@ -0,0 +1,33 @@
+---
+layout: post
+title: "Baltimore Crime Data Analysis"
+date: 2023-07-20 13:32:20 +0300
+description: Statistical analysis of crimes in Baltimore done on Kaggle. Here I show a summary and a link to my Kaggle account. I also included a PDF of the Python/Jupyter notebook. # Add post description (optional)
+#url:
+img: crimeBaltimore.png # Add image post (optional)
+fig-caption:
+tags: [crime, data analysis, statistical analysis]
+---
+This is a statistical analysis of crimes in Baltimore done on Kaggle. Here I show a summary and a link to my Kaggle account. I also included a PDF of the Python/Jupyter notebook.
+
+
+In this notebook I analyse the "Part1_Crime_Data.csv" dataset taken from the Baltimore City open data portal. This dataset represents the location and characteristics of major (Part 1) crimes against persons, such as homicide, shooting, robbery and aggravated assault, within the City of Baltimore. The data is updated weekly. This is an exploratory analysis.
+
+The data was last updated May 17, 2023; the original CSV file contains 565,726 records and 20 columns. Attributes (columns): CCNO, CrimeDateTime, Location, Description, Inside_Outside, Weapon, Post, Gender, Age, Race, Ethnicity, District, Neighborhood, Latitude, Longitude, Geolocation, Premise, Total_incidents.
+
+Here below is the link to my kaggle analysis :
+CRIMES IN BALTIMORE KAGGLE PROJECT
+
+
+## Crimes in Baltimore: analysis conclusions
+
+The Baltimore dataset contains data starting from the 1960s; however, those early entries don't seem consistent (only a few out of a total of half a million). The data becomes more consistent from 2012, while the data for 2023 is incomplete (since the year isn't finished). Therefore the analysis covers 2012 to 2023.
+
+The Baltimore crime data shows that specific types of crimes are more 'popular' regardless of the year, namely larceny, common assault and burglary, while others are less 'popular' regardless of the year, namely homicide, rape and arson. Larceny and larceny from auto both show a downward trend. Aggravated assault and homicide seem to follow the same upward trend. Robbery and rape both reached a peak in 2017. Shootings increased sharply from 2012 to 2015 and have climbed steadily since.
+
+Frankford is the neighborhood with the highest crime level, while the district with the highest level of crime is Southeast. However, when we look at the heatmap, no particular neighborhood or district stands out. From the above analysis we find that larceny, common assault and aggravated assault are the 3 most common crimes around the densest crime location (based on latitude and longitude).
+
+When it comes to the average time when crimes were perpetrated, it varies depending on the year. The only noticeable pattern is that crimes tend to happen between mid-afternoon (from 15:00) and midnight.
+I performed a simple regression with the year as the independent variable and the number of crimes per crime type as the dependent variable. I then predicted the number of crimes for 2023 and compared the results with the 2023 data we had previously (doubling the partial 2023 counts to approximate a full year). I concluded that the results are off, and that a deeper analysis should be done if we want to forecast the number of crimes (e.g., using time series).
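+
+As a rough sketch of that regression (the yearly totals below are dummy values standing in for the real Baltimore counts; the actual code is in the notebook PDF):
+
+``` python
+import pandas as pd
+from scipy import stats
+
+# Dummy yearly totals standing in for the real per-year Baltimore counts
+counts = pd.Series({2012: 47000, 2013: 46000, 2014: 45500, 2015: 47500,
+                    2016: 48000, 2017: 49000, 2018: 47000, 2019: 46500})
+
+# Simple linear fit of yearly crime counts on the year, then a 2023 projection
+slope, intercept, r, p, se = stats.linregress(counts.index, counts.values)
+pred_2023 = slope * 2023 + intercept
+print(f"projected 2023 total: {pred_2023:.0f}")
+```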
+I also checked whether race, age or gender has an impact on the type of crime by performing a chi2_contingency test, and concluded that it does. Further analysis would be needed to see exactly what the differences are.
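+
+A minimal sketch of that test (the contingency table below is dummy data; on the real dataset it would come from a pd.crosstab of, e.g., the Description and Gender columns):
+
+``` python
+import pandas as pd
+from scipy.stats import chi2_contingency
+
+# Dummy crime-type-by-gender table standing in for
+# pd.crosstab(df["Description"], df["Gender"]) on the real data
+table = pd.DataFrame({"F": [1200, 800, 150], "M": [2500, 2100, 600]},
+                     index=["LARCENY", "COMMON ASSAULT", "BURGLARY"])
+
+chi2, p, dof, expected = chi2_contingency(table)
+# A small p-value suggests crime type is not independent of gender
+print(f"chi2={chi2:.1f}, p={p:.4g}, dof={dof}")
+```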
+
+
+Below is a PDF version of my Kaggle project.
+
+
\ No newline at end of file
diff --git a/_posts/2023-07-20-tableau-project.md b/_posts/2023-07-20-tableau-project.md
new file mode 100644
index 0000000000..d058ba2824
--- /dev/null
+++ b/_posts/2023-07-20-tableau-project.md
@@ -0,0 +1,18 @@
+---
+layout: post
+title: Tableau - Prime Amazon Analysis
+date: 2023-07-20 13:32:20 +0300
+description: This will redirect you to my Tableau Dashboard - The dataset is taken from Kaggle # Add post description (optional)
+#url:
+img: tableau.png # Add image post (optional)
+fig-caption: This will redirect you to my Tableau Dashboard - The dataset is taken from Kaggle
+tags: [Tableau, Data analysis, graphs, Dashboard]
+---
+
+This will redirect you to my Tableau dashboard. The dataset is taken from Kaggle.
+
+## See link below
+
+
+Tableau - Prime Amazon Analysis
+
diff --git a/assets/img/CV-Maricarmen-ArenasL-BA-DA.pdf b/assets/img/CV-Maricarmen-ArenasL-BA-DA.pdf
new file mode 100644
index 0000000000..6d6a0df3a7
Binary files /dev/null and b/assets/img/CV-Maricarmen-ArenasL-BA-DA.pdf differ
diff --git a/assets/img/MaricarmenArenasTravail-session-eco8620.pdf b/assets/img/MaricarmenArenasTravail-session-eco8620.pdf
new file mode 100644
index 0000000000..61c902af3a
Binary files /dev/null and b/assets/img/MaricarmenArenasTravail-session-eco8620.pdf differ
diff --git a/assets/img/Maricarmen_Thesis__Copy__with_articles-4-ENG.pdf b/assets/img/Maricarmen_Thesis__Copy__with_articles-4-ENG.pdf
new file mode 100644
index 0000000000..6063f5615c
Binary files /dev/null and b/assets/img/Maricarmen_Thesis__Copy__with_articles-4-ENG.pdf differ
diff --git "a/assets/img/R\303\251forme syst\303\250me sant\303\251 en allemagne.pptx" "b/assets/img/R\303\251forme syst\303\250me sant\303\251 en allemagne.pptx"
new file mode 100644
index 0000000000..87233db008
Binary files /dev/null and "b/assets/img/R\303\251forme syst\303\250me sant\303\251 en allemagne.pptx" differ
diff --git a/assets/img/Synthesis-Activity-healthcare.pdf b/assets/img/Synthesis-Activity-healthcare.pdf
new file mode 100644
index 0000000000..4619cdf24f
Binary files /dev/null and b/assets/img/Synthesis-Activity-healthcare.pdf differ
diff --git a/assets/img/Untitled-2-Recovered.psd b/assets/img/Untitled-2-Recovered.psd
new file mode 100644
index 0000000000..322b56671d
Binary files /dev/null and b/assets/img/Untitled-2-Recovered.psd differ
diff --git a/assets/img/Untitled-3-Recovered.psd b/assets/img/Untitled-3-Recovered.psd
new file mode 100644
index 0000000000..fa883b7290
Binary files /dev/null and b/assets/img/Untitled-3-Recovered.psd differ
diff --git a/assets/img/Untitled-design-1-.png b/assets/img/Untitled-design-1-.png
new file mode 100644
index 0000000000..08ffa51295
Binary files /dev/null and b/assets/img/Untitled-design-1-.png differ
diff --git a/assets/img/automation.jpg b/assets/img/automation.jpg
new file mode 100644
index 0000000000..8056b47c81
Binary files /dev/null and b/assets/img/automation.jpg differ
diff --git a/assets/img/bA.jpeg b/assets/img/bA.jpeg
new file mode 100644
index 0000000000..eff6d83a96
Binary files /dev/null and b/assets/img/bA.jpeg differ
diff --git a/assets/img/business-intelligence-bi-key-performance-indicator-kpi-analysis-dashboard-transparent-blurred-background-130811027.webp b/assets/img/business-intelligence-bi-key-performance-indicator-kpi-analysis-dashboard-transparent-blurred-background-130811027.webp
new file mode 100644
index 0000000000..0bcbe7a3e2
Binary files /dev/null and b/assets/img/business-intelligence-bi-key-performance-indicator-kpi-analysis-dashboard-transparent-blurred-background-130811027.webp differ
diff --git a/assets/img/coding.jpg b/assets/img/coding.jpg
new file mode 100644
index 0000000000..67a48ebc47
Binary files /dev/null and b/assets/img/coding.jpg differ
diff --git a/assets/img/crimeBaltimore.png b/assets/img/crimeBaltimore.png
new file mode 100644
index 0000000000..de05599a96
Binary files /dev/null and b/assets/img/crimeBaltimore.png differ
diff --git a/assets/img/data-analysis-of-crimes-in-baltimore5.pdf b/assets/img/data-analysis-of-crimes-in-baltimore5.pdf
new file mode 100644
index 0000000000..69c77a9295
Binary files /dev/null and b/assets/img/data-analysis-of-crimes-in-baltimore5.pdf differ
diff --git a/assets/img/favicon/apple-touch-icon-144x144.png b/assets/img/favicon/apple-touch-icon-144x144.png
deleted file mode 100644
index 7b6dc51976..0000000000
Binary files a/assets/img/favicon/apple-touch-icon-144x144.png and /dev/null differ
diff --git a/assets/img/favicon/apple-touch-icon-72x72.png b/assets/img/favicon/apple-touch-icon-72x72.png
deleted file mode 100644
index 36d90e1572..0000000000
Binary files a/assets/img/favicon/apple-touch-icon-72x72.png and /dev/null differ
diff --git a/assets/img/favicon/apple-touch-icon.png b/assets/img/favicon/apple-touch-icon.png
deleted file mode 100644
index 34fee7b467..0000000000
Binary files a/assets/img/favicon/apple-touch-icon.png and /dev/null differ
diff --git a/assets/img/favicon/favicon.ico b/assets/img/favicon/favicon.ico
deleted file mode 100644
index a16b6fc27e..0000000000
Binary files a/assets/img/favicon/favicon.ico and /dev/null differ
diff --git a/assets/img/favicon/me3.png b/assets/img/favicon/me3.png
new file mode 100644
index 0000000000..e40989a767
Binary files /dev/null and b/assets/img/favicon/me3.png differ
diff --git a/assets/img/financial-stocks.jpg b/assets/img/financial-stocks.jpg
new file mode 100644
index 0000000000..c41e79beea
Binary files /dev/null and b/assets/img/financial-stocks.jpg differ
diff --git a/assets/img/health.jpg b/assets/img/health.jpg
new file mode 100644
index 0000000000..9ffc40dd41
Binary files /dev/null and b/assets/img/health.jpg differ
diff --git a/assets/img/lectures-dirigees-document-version finale-Maricarmen Arenas.pdf b/assets/img/lectures-dirigees-document-version finale-Maricarmen Arenas.pdf
new file mode 100644
index 0000000000..14329126ab
Binary files /dev/null and b/assets/img/lectures-dirigees-document-version finale-Maricarmen Arenas.pdf differ
diff --git a/assets/img/mainimage.png b/assets/img/mainimage.png
new file mode 100644
index 0000000000..656d1fe5d9
Binary files /dev/null and b/assets/img/mainimage.png differ
diff --git a/assets/img/me.jpg b/assets/img/me.jpg
new file mode 100644
index 0000000000..02cea1da2c
Binary files /dev/null and b/assets/img/me.jpg differ
diff --git a/assets/img/me2.png b/assets/img/me2.png
new file mode 100644
index 0000000000..78df2eca22
Binary files /dev/null and b/assets/img/me2.png differ
diff --git a/assets/img/me2.psd b/assets/img/me2.psd
new file mode 100644
index 0000000000..65ef544f4a
Binary files /dev/null and b/assets/img/me2.psd differ
diff --git a/assets/img/me3.png b/assets/img/me3.png
new file mode 100644
index 0000000000..e40989a767
Binary files /dev/null and b/assets/img/me3.png differ
diff --git a/assets/img/retirement.png b/assets/img/retirement.png
new file mode 100644
index 0000000000..bf44dcd4e5
Binary files /dev/null and b/assets/img/retirement.png differ
diff --git a/assets/img/sql.png b/assets/img/sql.png
new file mode 100644
index 0000000000..be90fe0b1c
Binary files /dev/null and b/assets/img/sql.png differ
diff --git a/assets/img/table2_3.png b/assets/img/table2_3.png
new file mode 100644
index 0000000000..dd16bfd3df
Binary files /dev/null and b/assets/img/table2_3.png differ
diff --git a/assets/img/tableau.png b/assets/img/tableau.png
new file mode 100644
index 0000000000..b3667d293a
Binary files /dev/null and b/assets/img/tableau.png differ
diff --git a/assets/img/thesis.png b/assets/img/thesis.png
new file mode 100644
index 0000000000..795525af85
Binary files /dev/null and b/assets/img/thesis.png differ
diff --git a/assets/img/twitter-api.jpg b/assets/img/twitter-api.jpg
new file mode 100644
index 0000000000..c010e6eebe
Binary files /dev/null and b/assets/img/twitter-api.jpg differ
diff --git a/assets/img/vbaexcel.jpg b/assets/img/vbaexcel.jpg
new file mode 100644
index 0000000000..022289ae8d
Binary files /dev/null and b/assets/img/vbaexcel.jpg differ