Introduction

Since the Great Recession ended, the US economy has grown steadily, adding roughly 8.1 million jobs and bringing the unemployment rate down from 10% to 6.2%. Although the labour market has not yet returned to its pre-recession condition, economic specialists consider it healthy and likely to regain its former strength. This recovery is attributed in part to deliberate federal policy measures, one of which is the minimum wage.

Minimum wage

Across the extensive literature on the minimum wage, authors differ on the extent to which a wage floor affects employment (Abraham and Katz 1986, p. 509). This disagreement makes the impact of the minimum wage on employment one of the most studied and most controversial questions in labour economics. On both theoretical and econometric grounds, however, it is arguable that the effect of a wage floor should be more noticeable in employment growth than in the level of employment. According to Raise the Wage (2014), President Barack Obama asked Congress to raise the nationwide minimum wage from $7.25 to $10.10 an hour, and signed an Executive Order raising the minimum wage to $10.10 for workers on new federal contracts. The increase is intended to raise the earnings of many workers and to support business activity in the US (Raise the Wage 2014). Most states set their minimum wage at the federal $7.25, with slight upward deviations in a few states. Analysis of aggregate US employment data suggests that job growth declined markedly in response to minimum wage increases (Abraham and Katz 1986, p. 515), yet the data do not show an equivalent reduction in the level of employment.
This apparently counterintuitive pattern is neither surprising nor a misleading reflection of the minimum wage's effect. Analysis of the negative impact on net job growth indicates that the minimum wage operates mainly through reduced job creation rather than increased job destruction (Raise the Wage 2014). In the US data, changes in the minimum wage have therefore affected the change in the number of jobs in the economy more than turnover within existing jobs (Autor 2011, p. 11).

Wage inequality

The pronounced wage inequality shown by US statistics over the last three decades can be attributed to a long-run shift in labour demand in the US, driven by the divergence of employment opportunities across occupations, with employment growth concentrated by skill level (Raise the Wage 2014). As in other countries, empirical research shows that highly skilled workers in the US labour market receive high wages while low-skilled workers receive meagre ones. According to the 2008 census survey, the distribution of employment across occupations has twisted over the last three decades, in contrast to the uniform rise of previous decades. Krueger et al. (2014, p. 234) find that the last decade saw strong growth in low-skill jobs, partly because of lower education levels in the preceding decade. This employment pattern had a marked impact on wage growth: because low-skill jobs carry low wages, the polarisation of employment at the lower end translated into low wages for the majority of the US population. In contrast, the scarcity of skilled workers raised the demand for them, and with it the wages attached to skilled jobs.
Consequently, wage gains accrued to the relatively few skilled individuals at the upper end of the distribution, while the abundance of unskilled workers drove wages at the lower end down sharply. Because the US labour market is increasingly skill-driven owing to technological advancement, wage inequality widened, with skilled workers accumulating wealth while their unskilled counterparts earned very little (Autor 2011, p. 12). At the same time, middle-class blue-collar jobs largely disappeared. On the available data, the Great Recession reinforced rather than reversed this polarisation of the US labour market by skill, in which the highly skilled earn high wages while the unskilled earn low ones (Borjas 2013, p. 36).

Short- and long-term unemployment

The unemployment rate can be determined by different criteria, notably U-3 and U-6. According to economists, U-3 is the more restrictive measure: it counts only individuals who are actively searching for employment but cannot find it (Autor 2011, p. 13). The U-6 measure is arguably the most complete criterion for determining the unemployment rate (Autor 2011, p. 13). It takes into consideration marginally attached workers, individuals who looked for work in the recent past even if they are not actively searching at present, as well as individuals employed part time who would prefer full-time employment (Borjas 2013, p. 83). Several recent studies note that conventional Phillips curve and Beveridge curve models predicted substantial price deflation, limited vacancies, and sharp wage declines as a result of the high unemployment witnessed during the Great Recession.
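The U-3 and U-6 measures described above differ only in who is counted. As a minimal sketch with invented counts (the figures below are illustrative assumptions, not official statistics), the two rates can be computed as:

```python
# Hypothetical counts in thousands -- illustrative only, not BLS data.
unemployed          = 9_600    # actively searched in the last four weeks (U-3 numerator)
labor_force         = 155_000  # employed plus unemployed
marginally_attached = 2_100    # want work, searched within the past year, not currently searching
part_time_economic  = 7_200    # working part time but preferring full-time work

# U-3: the restrictive headline rate.
u3 = unemployed / labor_force

# U-6: adds the marginally attached and involuntary part-timers to the
# numerator, and the marginally attached to the denominator.
u6 = (unemployed + marginally_attached + part_time_economic) / (labor_force + marginally_attached)

print(f"U-3: {u3:.1%}, U-6: {u6:.1%}")
```

With these invented counts, U-6 comes out roughly double U-3, mirroring the qualitative gap between the restrictive and the complete measure described in the text.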
Notably, some economists explain the failure of these Phillips curve predictions through changes in inflation dynamics and their interactions (Krueger et al. 2014, p. 235). Others insist that the price-wage Phillips curve is stable only if short-term unemployment, rather than the total unemployment rate, is used. Neither explanation, however, implies that the long-term unemployed sit at the margin of the labour market. Instead, the demand-side and supply-side accounts of long-term unemployment are probably complementary rather than competing explanations (Krueger et al. 2014, p. 235), since statistical discrimination against the long-term unemployed can itself discourage job search (Autor 2011, p. 16). Nonetheless, this analysis makes clear that the long-term unemployed are less connected to the economy than the short-term unemployed. Likewise, the data suggest that the long-term unemployed are more likely to withdraw from the labour force than the short-term unemployed (Autor 2011, p. 16). Krueger, Cramer, and Cho (2014, p. 229) affirm that, in its forward guidance, the Federal Reserve has shifted attention from a quantitative unemployment threshold to broader and more comprehensive measures of the labour market. Consistent with the econometric principle that policy should adjust in response to outcomes, US policymakers have worked to improve the deteriorating conditions that resulted from the Great Recession (Autor 2011, p. 17).

Profiling long-term and short-term unemployment

US unemployment statistics suggest that for most of the decades before the Great Recession, the long-term unemployed made up between 10% and 20% of all unemployed individuals (Krueger et al. 2014, p. 247). After the Great Recession, this share escalated to an average of 40%.
This means that long-term unemployment places a greater strain on the present US economy than ever before (Krueger et al. 2014, p. 248). Comparing the combined populations of the long-term and short-term unemployed with the employed, it is notable that the majority of the unemployed are younger, well educated, and unmarried. In terms of occupation and education level, the mismatch between workers and the kind of duties they undertake is similar across the two groups (Krueger et al. 2014, p. 248). In addition, both groups seem to have equivalent qualifications, so any structural problem leading to long-term unemployment may stem from lack of motivation, low self-esteem, or the negative attitude of employers who assume that long-term unemployment erodes knowledge.

Duration of unemployment

Analysing three decades of US unemployment data, three patterns stand out in the relationship between unemployment statistics and unemployment duration. First, the short-term unemployed have a greater chance of transitioning into employment than the long-term unemployed (Borjas 2013, p. 116). Second, across the entire three decades of data, the long-term unemployed remain within a narrow range in their propensity to quit the labour force, which exceeds that of the short-term unemployed. Lastly, the data imply a sharper fall in labour force exit around recessions for the long-term unemployed than for the short-term unemployed, and a larger fall in job-finding rates around recessions for the short-term unemployed than for the long-term unemployed (Borjas 2013, p. 127).
Work trends among the unemployed

Another important aspect of analysing unemployment is the relationship between work trends and the transition rates of the unemployed. Contrary to many expectations, the gap between short-term and long-term unemployment is quite large (Autor 2011, p. 18). Work trend survey results indicate that the long-term unemployed are less likely to find work than the short-term unemployed, for both full-time and part-time jobs (Autor 2011, p. 18).

Regional differences in unemployment within the United States

Recent data indicate that while some US states have fully or partially recovered from the Great Recession, others lag far behind in the recovery process. This points to uneven unemployment arising from regional economic differences. From these data, it is arguable that long-term unemployment is unpredictable even in states with low overall unemployment (Autor 2011, p. 18). Economic factors such as the boom in energy production in states like Alaska, Iowa, and West Virginia, among others, have a large effect on the overall unemployed population (Abraham and Katz 1986, p. 520). The situation is complicated by two competing scenarios. First, in areas with stronger economies, especially energy-producing regions, firms are likely to absorb more workers, lowering the unemployment rate. At the same time, because the economy is strong, most employees are likely to earn a sustainable income, and as a result the long-term unemployed may withdraw from the labour market (Krueger et al. 2014, p. 259), further lowering the measured unemployment level. In the second scenario, a stronger economy signals a greater likelihood of finding employment.
Therefore, even the long-term unemployed might not give up, but instead keep up the job hunt (Abraham and Katz 1986, p. 521). This could ultimately translate into a large number of recorded unemployed in such regions.

Calibration model

Statistical evidence indicates that after the Great Recession, the relationship between vacancies and unemployment known as the Beveridge curve shifted outwards, with more vacancies than predicted for the high unemployment rate observed (Krueger et al. 2014, p. 258). Notably, this relationship is stable when the short-term unemployment rate is used. An outward shift of the Beveridge curve after a severe shock can arise from slow job growth, an increase in long-term unemployment, a decrease in overall matching effectiveness, and a reduction in the number of individuals quitting the labour force (Borjas 2013, p. 169). This applies particularly to the long-term unemployed. The unemployment-vacancy path can eventually relax back to the original Beveridge curve as the long-term unemployed withdraw from the workforce.

Gender discrimination in the US labour market

An analysis of recent gender employment patterns reveals the occupational distinctions by gender in the US labour market. Historically, blue-collar jobs such as crafts and machine operation were reserved for men, while women were concentrated in clerical occupations (Krueger et al. 2014, p. 258). The situation today is very different. Even though the contrast with traditional gender domination of occupations is visible, little is known about its cause. Some labour market scholars believe this employment disparity results from gender differences in job choice; alternatively, it could stem from differences in the characteristics of the US labour force, such as occupational segregation.
Occupational segregation is the exclusion of workers from certain professions alongside their dominance of others. Autor (2011, p. 12) states that researchers have long studied the measures and consequences of occupational segregation in the labour market. The changes witnessed in the occupational composition of the US labour market stem from several factors, but long-term transformation in occupations is central among them (Autor 2011, p. 16). In this respect, the growth of the female labour force is linked to the rise in the share of white-collar jobs in the US labour market. As more women join the labour market, some with higher educational levels than their male counterparts, they are absorbed into the rapidly growing white-collar clerical, professional, and technical fields (Autor 2011, p. 17).

The ageing population of the US labour market

Statistical analysis projects the US population to increase by 91 million over the next four decades, from 309 million in 2010 to the 400 million mark by 2050 (Abraham and Katz 1986, p. 508). Although this growth is expected to occur across age brackets, it will be concentrated in the ageing bracket. In essence, the projection implies that the number of people aged 65 and above will more than double, rising from 13% of the population in 2010 to 21% of the projected 2050 population (Abraham and Katz 1986, p. 510). The working-age bracket of 20 to 64 will also continue to grow, but much more slowly than in the era when the baby boomer bulge propelled it (Abraham and Katz 1986, p. 511). This working-age population is therefore likely to shrink from 60% of the total in 2010 to 55% in 2050.
Notably, the labour force participation rate in the United States has dropped sharply since the Great Recession of 2007 to 2009. The fall is attributed to three main factors: according to Abraham and Katz (1986, p. 510), cyclical effects from the Great Recession, the ageing of the population, and a combination of several other minor factors explain the transition. Of these, the ageing population accounts for the better part of the effect to date.

Government policies to increase labour market flexibility

Labour market flexibility is defined differently and means different things to different people. In some parts of the world it means room for employers to fire employees or reduce wages (Krueger et al. 2014, p. 255); in the US, it is treated as a virtue aimed at empowering employees. In earlier periods, low unemployment coupled with tight labour markets forced authorities and policymakers to formulate numerous programmes to make labour markets more flexible and effective. Some of these policies aimed at expanding the labour force. In the US, policies to increase labour market flexibility sought to extend services to identify both the long-term and the immediate needs of the labour market (Krueger et al. 2014, p. 258). This included working with employers to screen and select trainees to meet the immediate demands of the US labour market. To this end, the US Federal Government established professional training centres for white-collar occupations. In addition, amid the transformation to technology-based operations, this move aimed at strengthening workers' training programmes to produce capable graduates for a demanding labour market (Krueger et al. 2014, p. 259). Most of these policies aimed at placing citizens in sectors with surplus job openings.
While the original policies aimed at training and integrating new employees into industry, the recession later shifted the focus to policies meant to retain the employed for as long as possible (Krueger et al. 2014, p. 261). To this end, the US Federal Government initiated short-term work programmes, in which workers received partial unemployment benefits after reducing their working hours so as to avoid lay-offs. Equally, the Federal Government provided employers with subsidies to retain workers who would otherwise have been laid off. In consequence, the US model of labour market flexibility proved arguably the most effective response to the growing unemployment dilemma (Krueger et al. 2014, p. 263).

Inward and outward migration

The United States' labour market is notable for the outward migration of skilled workers for permanent or temporary work. At the same time, it absorbs a large number of inward migrants who are mostly unskilled workers. According to Borjas (2013, p. 67), migration has both merits and demerits: it can drain the country of skilled labour, affecting the workforce negatively, while at the same time adding to migrants' incomes. Conversely, the inflow of unskilled workers increases competition and can thereby raise the level of unemployment in the country. Although the outflow of skilled workers prominent in the American labour market might lead to brain drain, it is arguable that temporarily employed workers can bring new ideas back to their country upon their return (Borjas 2013, p. 87). Through these efforts, the Federal Government intended to create policies that would increase labour market flexibility and strengthen the US internal labour market (Krueger et al. 2014, p. 295).
Even though some of these policies allow employers to lay off workers, their ultimate goal is to generate additional job opportunities, even at the expense of high living standards. From this research, it is fair to argue that both the outward and inward migration of the US population benefit the US labour market. Outward migration eases competition in the local market, maintaining high demand for expert skills; this can sustain the wage gap in the country while helping to reduce the unemployment rate in the robust US labour market. For inward migration, the high number of unskilled workers increases competition for unskilled job opportunities (Borjas 2013, p. 90), which helps the Federal Government maintain low wages for unskilled workers and thereby shapes overall labour market outcomes.

References

Abraham, K. and Katz, L. F. 1986, 'Cyclical unemployment: sectoral shifts or aggregate disturbances?', Journal of Political Economy, vol. 94, no. 1, pp. 507-522.

Autor, D. 2011, 'The polarization of job opportunities in the US labour market: implications for employment and earnings', Journal of Community Investment, vol. 23, no. 2, pp. 11-18.

Borjas, G. J. 2013, Labor economics, McGraw-Hill, New York.

Krueger, A. B., Cramer, J. and Cho, D. 2014, 'Are the long-term unemployed on the margins of the labor market?', Brookings Papers on Economic Activity, vol. 17, no. 9, pp. 229-302.

Raise the Wage 2014. Web.
Crime Prevention Program Adopted by Police Force in Australia: Police in Schools Program
Crime Prevention Program – Police in Schools Program

The Police in Schools Program functions as a joint crime prevention strategy between the community, social institutions, and law enforcement agencies. Although implementations of the program vary moderately both nationally and internationally, it generally involves the permanent placement of a local police officer in a school environment. As a part of the school community, the School Based Police Officer (SBPO) conducts investigations of offences committed during school hours, participates in the development and delivery of educational materials, and liaises with relevant agencies regarding youth welfare (Murphy, 1998). The Police in Schools program operates in a number of Australian schools nationally. A leading objective of the program is to improve the relationship between police and young people by encouraging positive relationships with law enforcement officers; the SBPO facilitates opportunities for police and young people to interact in an informal and predominantly positive context. The expected outcomes of a successful school-based policing program include a reduction in crime both by and against young people; the provision of a safe environment and a community-wide support system for young people which promotes co-operation and care; an increase in students' knowledge of the law and the function of police in society; and the development of positive relationships between young people, members of the school community, and the police in general (Murphy, 1998).

Rationale and Empirical Evaluation of Police in Schools

Although there is a distinct absence of police jurisdictions in Australia which have institutionalized community policing as the dominant organizational paradigm (Putt, 2010), a number of police programs and initiatives have been launched nationally which are strongly associated with the rubric of community policing.
Current evidence-based approaches encourage community policing and government youth strategies to provide a more objective and holistic response to policing young people (National Youth Policing Model, 2010). The Police in Schools program introduces adolescents to empirically supported community-police partnership models intended to inspire trust between community members and police officers, encourage respectful behaviour, and explicitly promote collaborative processes between law enforcement officials and the community sector (Murphy, 1998). The program also presents an intervention and prevention strategy during key developmental stages for youths that aims to reduce risk factors and enhance protective factors for crime, which relates directly to, and forms the premise of, developmental theories of crime. The current essay will explore and discuss the theoretical rationale which guides the Police in Schools Program and will critically evaluate the program's effectiveness. It will begin by exploring how the developmental perspective of crime and crime prevention offers a theoretical foundation for the program. It will then examine the role of the School Based Police Officer (SBPO) and the effectiveness of the program in reducing truancy, increasing collective efficacy (CE), and lessening the frequency of bullying on school grounds as a protective approach against life-course-persistent anti-social behaviour. Lastly, it will examine evaluations of effectiveness nationally and internationally and will discuss potential improvements to the current program implementation. The Police in Schools program is theoretically grounded in and guided by the developmental perspective of crime.
Researchers have argued that understanding both behavioural development and the development of crime and anti-social behaviour across the lifespan requires a holistic-interactionist perspective on the synergistic relationship between biological, psychological, environmental and cultural determinants (Morizot
What is the difference between a randomized ANOVA and a repeated measures ANOVA?, statistics homework help
Please explain all solutions.

2) What is the difference between a randomized ANOVA and a repeated measures ANOVA? What does the term one-way mean with respect to ANOVA?

4) If a researcher decides to use multiple comparisons in a study with three conditions, what is the probability of a Type I error across these comparisons? Use the Bonferroni adjustment to determine the suggested alpha level.

6) When should post hoc comparisons be performed?

8) Why is a repeated measures ANOVA statistically more powerful than a randomized ANOVA?

10) In a study of the effects of stress on illness, a researcher tallied the number of colds people contracted during a 6-month period as a function of the amount of stress they reported during the same period. There were three stress levels: minimal, moderate, and high stress. The sums of squares appear in the following ANOVA summary table. The mean for each condition and the number of subjects per condition are also noted.

Source           df    SS       MS    F
Between groups         22.167
Within groups          14.750
Total                  36.917

Stress level    Mean    N
Minimal         3       4
Moderate        4       4
Maximum         6       4

A) Complete the ANOVA summary table.
B) Is F_obt significant at α = .05, or at α = .01?
C) Perform post hoc comparisons if necessary.
D) What conclusions can be drawn from the F ratio and the post hoc comparisons?
E) What is the effect size? What does it mean?
F) Graph the means.

12) A researcher conducted an experiment on the effects of a new "drug" on depression. The researcher had a control group that received nothing, a placebo group, and an experimental group that received the "drug". A depression inventory providing a measure of depression on a 50-point scale was used (50 indicates that an individual is very high on the depression variable). The ANOVA summary table appears next, along with the mean depression score for each condition.
Source           df    SS          MS    F
Between groups         1,202.313
Within groups          2,118.000
Total                  3,320.313

Drug condition    Mean     N
Control           36.26    15
Placebo           33.33    15
Drug              24.13    15

a) Complete the ANOVA summary table.
b) Is F_obt significant at α = .05; at α = .01?
c) Perform post hoc comparisons if necessary.
d) What conclusions can be drawn from the F ratio and the post hoc comparisons?
e) What is the effect size, and what does this mean?
f) Graph the means.

14) A researcher has been hired by a pizzeria to determine which type of crust customers prefer. The restaurant offers three types of crust: hand-tossed, thick, and thin. Following are the mean numbers of 1-inch pieces of pizza eaten for each condition from 10 subjects who had the opportunity to eat as many pieces with each type of crust as they desired. The ANOVA summary table also follows.

Source     df    SS       MS    F
Subject          2.75
Between          180.05
Error            21.65
Total            204.45

Crust type     Mean    n
Hand-tossed    2.73    10
Thick          4.20    10
Thin           8.50    10

a) Complete the ANOVA summary table.
b) Is F_obt significant at α = .05; at α = .01?
c) Perform post hoc comparisons if necessary.
d) What conclusions can be drawn from the F ratio and the post hoc comparisons?
e) What is the effect size, and what does this mean?
f) Graph the means.

2. What is an F ratio? Define all the technical terms in your answer.
3. What is error variance and how is it calculated?
4. Why would anyone ever want more than two (2) levels of an independent variable?
5. If you were doing a study to see if a treatment causes a significant effect, what would it mean if within-groups variance was higher than between-groups variance? If between-groups variance was higher than within-groups variance? Explain your answer.
6. What is the purpose of a post hoc test with analysis of variance?
7. What is probabilistic equivalence? Why is it important?
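As a hedged worked sketch of question 10 above (parts A, B, and E, together with the Bonferroni adjustment asked for in question 4), the missing ANOVA table entries follow mechanically from the given sums of squares. The critical values quoted in the comments are standard F-table values:

```python
# Question 10: three stress levels, four subjects each.
k, n_per = 3, 4
ss_between, ss_within = 22.167, 14.750

df_between = k - 1            # 2
df_within = k * (n_per - 1)   # 9
ms_between = ss_between / df_between
ms_within = ss_within / df_within
f_obt = ms_between / ms_within            # about 6.76

# Effect size: eta squared, the share of total SS explained by stress level.
eta_squared = ss_between / (ss_between + ss_within)   # about .60

# From an F table, F_crit(2, 9) = 4.26 at alpha = .05 and 8.02 at alpha = .01,
# so F_obt is significant at .05 but not at .01.

# Question 4: Bonferroni-adjusted alpha for 3 pairwise comparisons.
alpha_per_comparison = 0.05 / 3           # about .017

print(round(f_obt, 2), round(eta_squared, 2), round(alpha_per_comparison, 3))
```

The same arithmetic applies to questions 12 and 14, with their own k, n, and sums of squares substituted in (question 14 is repeated measures, so a subject term is also removed from the error SS).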
Importance of Inductive Reasoning Skills for Various Studies Proposal
Table of Contents

Abstract
Introduction
Literature Review
Methods
Results
Discussion and Conclusion
Appendix A: Questionnaire
Appendix B: Dataset
References

Abstract

The importance of inductive reasoning skills (IRS) for academic performance is a topic of increased interest for the international community of researchers in education. The literature review suggests that inductive reasoning can positively affect the grade point average (GPA) of students of different age groups. However, the current body of research lacks information on whether such skills are more important in some fields of study than in others. The present project compares the results of regression analyses of GPA versus IRS for students of social sciences and students of technology. The results demonstrate that IRS is more important for technology students; however, they need further confirmation owing to the limitations of the present research.

Introduction

The importance of inductive reasoning skills (IRS) for the academic achievement of students is a highly discussed topic in scientific and academic literature. Inductive reasoning is commonly defined as the ability to make predictions based on existing knowledge (Hayes
Ethical Issues in Randomized Control Trials
PROBLEM 1

A research team is conducting a Randomized Control Trial of a new drug to treat the common symptom of the Ebola virus (fever) over the past six (6) months. The experimental group consisted of female Ebola patients aged between 30 and 50 years to whom the new drug was administered. The control group consisted of male Ebola patients aged between 70 and 80 years. This control group was administered a placebo, a substance that resembles medicine superficially and is believed by the patient to be medicine but that has no medicinal value.

Discuss ethical issues associated with this research design.

Ebola virus disease, also known as EVD, is a highly infectious and contagious disease which has recently killed thousands, especially in West Africa. It is a severe and most often fatal illness in humans. The research design used a Randomized Control Trial; however, it does not state how the RCT sample size was calculated. The participants' ages are listed, but we do not know how many participants the experiment had. The research claims random sampling, but was the randomization truly "random", or are there really two separate populations being studied here? It is very difficult to treat two such different age groups of men and women as one randomized pool. As stated above, Ebola is a very dangerous disease, and one of the ethical principles of research is that vulnerable groups should not be used unless the benefits outweigh the harms; the men aged between 70 and 80 years fall into that category as elderly. This design would have given placebo (dummy) medicine to this vulnerable group over a period of six months. How many individuals would have been lost to this fatal disease? The Declaration of Helsinki states that in any medical study, every patient, including those of a control group, if any, should be assured of the best proven diagnostic and therapeutic method.
The control group was not assured of this. Placebo-controlled trials are justified when testing a new product like a hair-removing cream, which causes no permanent damage; with a severe illness this cannot be acceptable. Using a placebo control for an illness that is fatal and highly contagious is not justifiable with Ebola, because without any medical intervention the patients will die. The study design also shows large evidence of both allocation and performance bias: the women selected for the intervention group were specifically chosen because they would perform better, facilitating a quick and desirable recovery, compared to the elderly men, whose bodies cannot respond with the same efficiency. Lastly, some of these elderly men are husbands, dads, granddads and brothers of other people; countless families will suffer endlessly over the 6 months whilst their relative is not getting any help at all. What modifications would you suggest on the research design in future? There is never a single way to follow when it comes to research; however, there are research designs which are more suitable and permit the evidence obtained to answer the initial question as explicitly as possible. In future I would use a time series design, because a time series design allows each participant to receive the intervention over a period of time, with results measured before and after each intervention, hence reducing the fatalities associated with Ebola and also making the disease less contagious if the medication is effective. Another change I would introduce is to remove the placebo medication: one cannot compare a drug's efficacy to a dummy placebo, and if it is a new medication then it should be compared to other similar drugs to assess its effectiveness, instead of to nothing, whilst humans are dying and others are getting infected within those 6 months.
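The missing sample-size calculation noted earlier can be addressed with the standard two-proportion power formula, n = (z_a + z_b)^2 [p1(1-p1) + p2(1-p2)] / (p1 - p2)^2 per arm. The recovery rates, significance level and power below are illustrative assumptions, not figures from the trial:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Participants needed per arm to detect a difference between two
    proportions (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_b = NormalDist().inv_cdf(power)          # desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * variance / (p1 - p2) ** 2)

# Assumed example: 50% recovery on standard care vs. 70% on the new drug,
# at 5% significance and 80% power.
print(n_per_group(0.50, 0.70))  # 91 per arm under these assumptions
```

Stating such a calculation in the protocol would show that the trial enrols enough patients to answer its question without exposing more people than necessary.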
Another change would be for the research not to use vulnerable elderly people as the control group, instead using an adolescent and young adult sample population, since they would give a clearer indication of the efficacy of the new drug. Lastly, I would calculate a large enough sample size to increase the internal and external validity of the findings. Because Ebola is fatal, an adequate sample size would help by clarifying the total fatalities experienced, but most importantly the number of patients cured. PROBLEM 2 One of the leading causes of the fast spread of HIV and AIDS in Africa is poverty, particularly income poverty, which forces unmarried women and girls to indulge in prostitution. In January 2005, the IMF/World Bank designed a 10-year micro-finance project targeting 1000 prostitutes. The aim of the project was to see a significant drop in the number of women or girls who indulge in this malpractice. To be registered as a beneficiary, interested women and girls were required to submit an application and pay a processing fee of MK500. A total of 2500 applications were received at the close of the deadline. To identify project beneficiaries, it was decided that a lottery be conducted and that all applicants be invited to witness the draw. After selecting the beneficiaries (i.e., the treated group), a random draw was also conducted to select non-beneficiaries (i.e., the control group). Discuss the ethical issues associated with this research design. Acquired immune deficiency syndrome (AIDS) develops from infection with HIV (human immunodeficiency virus), which attacks the immune system and disables a person's defenses against other diseases, including infections and certain cancers.
This research used random selection to find the 1000 participants it wanted to induct into the micro-finance programme. Firstly, consider the time frame of the project: it was meant to run for 10 years, and given that this would have been an observational study design, a lot can happen within 10 years that will affect the internal validity of the findings. Secondly, this research design asked participants to pay an application fee of K500, which would total K1,250,000 for the 2500 applicants. Research ethics prohibits payments that can potentially cause pressure, bribery, or economic and social disadvantage. Therefore many individuals who really need the help would have been excluded, simply through their inability to raise the K500. It is also unethical to ask for money from an impoverished group who happen to practise prostitution; they will engage in this very malpractice to raise the K500, so the project would not be stopping prostitution but encouraging it on the other hand. Research designs have to respect the privacy and confidentiality of participants at all times. Conducting a lottery where everyone is invited, and dividing the treatment and control groups of prostitutes in front of a congregation, hampers the ethics of privacy and confidentiality; this sort of exposure can result in the applicants being looked down upon by community members. What modifications would you suggest on the research design in future? When it comes to payments, research ethics promotes that participants should be suitably compensated for any expenses, compensated for effort, time or lost income, and acknowledged for their contribution. In total this research raised K1,250,000. The suitability of this money is not justifiable, since the IMF/World Bank will actually be spending huge amounts monthly to sustain the project. I would remove this application fee so that the project is open to every suitable candidate without financial hindrance.
The applicants were unmarried women and girls, and I think the focus should have been different, since these are different age groups. The women can be put on the micro-financing plan, whilst the girls can be given a different option, continuing with education, with the money directed towards their fees. To expand on that, it would be better to teach the participants to fish rather than give them fish every time for 10 years, since when the project stops they will go back and continue with their malpractice (prostitution). However, if you can teach some of the women income-generating activities and provide education for the girls, then they will be able to become independent and stop the prostitution altogether. This is known as transformative participatory monitoring and evaluation. Another change I would install concerns the time span of the project: 10 years is a lot of time, and I would change the research design to a Randomized Control Trial with a crossover design. This would allow all 2500 participants to partake, thus having no control groups, but rather a time sequence determining when each would receive the money; hence follow-ups can be conducted on what the individuals are capable of with and without benefits. Finally, I would change the design of the selection process for the two groups: the lottery selection where everyone is invited would be cancelled, and instead an expert panel would assess the economic, social and health status of the applicants. Decisions would then be made and benefits awarded to the truly needy ones.