Project Management Certifications- A Preliminary Comparison

31 Dec 2009 - 05:39 AM under IPMA, Other, PMP, PRINCE2

Guest post from Dr. Paul Giammalvo

INTRODUCTION

Have you ever wondered what professional level credentials are available for project/program managers to choose from, and how these various project management certifications compare against one another?

This was the subject that evolved from a lengthy and sometimes heated debate on one of the LinkedIn discussion pages relating to project management. While the original topic was "PMP: Does it assure you a job?"[1], the debate quickly went off topic and centered on the relative value of the various credentials.

What became clear is that while everyone THINKS (or at least would like to believe) the credential they hold is the most valid and appropriate measure of project management knowledge, skills and competency, a quick online review of published literature showed little or no peer reviewed research on this topic to provide any guidance or insight.

Another problem in making any comparison is that while nearly all of the major professional organizations offer multiple levels of credentials, one cannot tell from the names of those credentials exactly what they represent vis-a-vis one another. (e.g. does holding the Project Management Professional (PMP) from PMI really mean that the holder is a professional project manager? Or what is a "Certified Cost Engineer", and how does that relate to project or program management?)

This lack of any meaningful comparison was the driving force behind this exploratory research effort.

Please note: this experiment is NOT intended to be a definitive piece of research. Many assumptions, which may or may not be valid, have been incorporated into the calculations. The sole purposes were, and are:

1) to see if it was feasible to produce a meaningful ratio scale against which to rank order and compare the relative standings of the various credentials from information available on the internet, and;

2) to generate sufficient interest and debate for others to carry this research forward in a more academically sound and rigorous manner.

Ideally, this will be seen as a challenge, a trigger for all concerned: practitioners; the professional organizations who purport to represent them; and those companies and agencies who employ or contract the professional services of the holders of these credentials. Together they could support the creation of an INDEPENDENT testing and validation organization (such as the Global Alliance for Project Performance Standards- GAPPS[2]) to create and maintain an evaluation standard: a "Consumer Reports" or "Underwriters Laboratories" of project/program management related certifications and credentials.

SELECTING THE CREDENTIALS TO COMPARE

The first step was to identify all those credentials which are generally globally recognized and which are advertised or otherwise positioned as attesting to knowledge, skills, attitudes, strengths or competency in project, program or portfolio management.

The following organizations were selected as they are generally recognized around the world. (in alphabetical order)

  • American Society for the Advancement of Project Management (asapm, as an IPMA Member)
  • Association for the Advancement of Cost Engineering International (AACE)
  • Australian Institute of Project Management (AIPM, also a member of IPMA)
  • International Council on Systems Engineering (INCOSE)
  • OGC/APM’s PRINCE2
  • Project Management Institute (PMI)

This list is NOT all inclusive, nor was it intended to be, but it was felt that it represents the more commonly recognized credentials in the field of project/program management.

As the International Project Management Association (IPMA) is an umbrella organization comprised of member organizations from individual countries, the Australian Institute of Project Management (AIPM) credentials, along with those from the relatively new American Society for the Advancement of Project Management (asapm), were selected as representative of those offered by other IPMA organizations. For the purposes of this paper, the asapm "project associate" and the AIPM CPPP are roughly equivalent to IPMA Level D; the asapm "project manager" and AIPM's Certified Practicing Project Manager (CPPM) are roughly equivalent to IPMA Level C; asapm's "senior project manager" is roughly equal to IPMA's Level B (AIPM seems not to have a certification at this level yet); and asapm's "program manager", "portfolio manager" and "project director" are all IPMA Level A, as is AIPM's Certified Practicing Project Director (CPPD).

This limited research also didn't address other specialty certifications, such as the Construction Management Association's Certified Construction Manager (CCM) credential, although for follow-on research it would be interesting to add those as well. Also interesting would be to compare and benchmark the various project and program management credentials against existing licensing requirements, such as the North American Professional Engineer (PE) licensing process, or against other professions, such as the CPA or the various medical board certification processes.

DEVELOPING THE RATING CRITERIA

A review of the published facts from each of the websites indicated a fair degree of consistency in the information provided by each organization with regards to their credentials. While not always easy to find, the information normally and customarily included on the websites could be categorized into the following 27 general topics or headings:

  1. Name and contact information of the Developing Organization
  2. Certification Name
  3. Certification Acronym
  4. Date Certification Initiated or started
  5. General Description
  6. Process to get certified
  7. Does the certification require experience and if yes, how many hours?
  8. Does the certification require a degree or can experience be substituted in lieu of a degree?
  9. Is the credential exam based only, peer reviewed only, or both?
  10. If exam based, the duration of the exam
  11. Number of questions on the exam
  12. Type of questions
  13. Passing score or grade
  14. Cost of Exam (Members)
  15. Cost of Exam (Non-Members)
  16. Membership Cost
  17. Cost Comparison- Better to join or not to join?
  18. Books REQUIRED to pass the exam? (if any)
  19. Cost of the REQUIRED books? (if any)
  20. Course(s) required prior to sitting for the exam?
  21. Number of hours training required prior to sitting for the exam?
  22. Is there a paper required in addition to the exam?
  23. How long is the certification valid?
  24. Renewal Requirements
  25. Renewal Costs
  26. URL for more information on the organization or credential
  27. URL for Training/Other information

The attached Excel spreadsheet[16] contains a summary of the data gleaned from the websites. Please note that while reasonable attempts were made to validate the information, including sending the file to responsible individuals active in these organizations, no formal request was made to the organizations themselves to validate or clarify it. Also, the information was last checked on 22 December, 2009, and may well have changed by the time this paper is read.

As can be appreciated, there is precious little data on the various organizational websites to enable any meaningful comparison between the credentials. Whether this is intentional or not, practitioners and employing organizations alike should insist on their rights as consumers and expect the professional organizations to provide sufficient data for potential certification seekers, and for those who employ the people certified, to make a fair and rational evaluation. This topic is addressed more fully in the recommendations.

METHODOLOGY

Given the paucity of useable publicly available information, which, if any, of the 27 readily and publicly accessible attributes could serve as the basis for a nose-to-nose, toes-to-toes evaluation between credentials?

Prior research[3] by this author indicated that

"An attribute or trait common to nearly all definitions of a profession is the expectation that

Professions require a 'long' period of education and training. Based on the literature research, this attribute is often broken down into two parts: 1) formal education, usually at minimum four years beyond high school, but often longer. Polelle (1999) in particular identified several US state supreme court decisions establishing a four year education as one of the 'bright line' tests that jurists use to determine whether an occupation is or is not a profession. 2) Some form of supervised, 'hands on' training, apprenticeship, internship or experience-based element, designed to build competency."

However, as indicated by Pierce v AALL Insurance (1988)[4] and by Garden v Frier (1992)[5], the Florida Supreme Court felt that apprenticeship alone, without a four year degree, did not qualify an occupation as a profession (Polelle, 1999). A North Dakota Supreme Court ruling (Jilek v Berger Electric)[6] also differentiated the trades from the professions: although the trades required a license to practice, the absence of a degree requirement meant they did not qualify as professions. (See Recommendations for more on this issue)

Based largely on this research, when the data obtained from the websites was compared against the attributes of a profession, it was clear that of the 27 possible pieces of information provided by the organizations, only four made any sense to use:

1) work experience requirements;

2) formal educational requirements;

3) testing for knowledge; and/or

4) assessment process used to determine competency.

As all four of these variables are readily available from the various organizational websites, and are appropriate measures from which to infer the relative professional "strength" of one organization's credential compared to another, the exploratory or preliminary model was based on them.

Hours of work experience required for those who already hold a Bachelor's degree (WEXP)- This information was readily available from all the certification websites and/or from the downloadable .pdf files. Here we find that not all organizations use the same number of working hours. For the purposes of this research, a working year consisted of 40 hours per week, 50 weeks per year. As project managers rarely work a 40 hour week, holidays and sick days were assumed to be offset by the overtime hours.
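The working-year convention above can be sketched as a small helper (Python is used here purely for illustration; the function name is my own):

```python
# Convert years of required work experience into hours using the paper's
# convention: 40 hours/week x 50 weeks/year = 2,000 hours per working year.
HOURS_PER_WORK_YEAR = 40 * 50  # 2,000 hours


def wexp_hours(years):
    """WEXP: hours of work experience implied by a requirement given in years."""
    return int(years * HOURS_PER_WORK_YEAR)


# e.g. an 8-year experience requirement translates to 16,000 hours
print(wexp_hours(8))  # 16000
```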

Standardized Value of a Bachelors Degree (BDEG)- Because there is little or no apparent consistency in how different professional organizations equate the value of a degree to equivalent work experience, a web based review was conducted. Using the undergraduate degrees in construction project management from Purdue, the University of Houston and the University of Florida, and the general project management degree from Colorado Technical University, this yielded an AVERAGE of 130 credit hours required to earn a Bachelors degree. BDEG was broken down further into two components:

Actual Time Spent In Class- To calculate this value, it was assumed that each 3 credit hours awarded requires 40 hours of face to face classroom time. Thus, using a nominal 125 credit hours (rounding the 130 hour average): 125/3 ≈ 41.67 individual courses X 40 hours per 3 credit course ≈ 1,667 hours spent in a classroom.

Level of Effort expected by the student OUTSIDE of class time- Following the same approach, it was generally agreed that at the undergraduate level, for each hour spent in the classroom, a MINIMUM of 2 hours needed to be spent by a student doing homework, writing research papers and taking exams: 1,667 classroom hours X 2 ≈ 3,333 hours of student effort outside of class.

Thus to calculate the BDEG (Standardized Value of a Bachelors Degree) we add the class time plus the out of class time 1,667 + 3,333 = 5,000 hours of learning experience.

Standardized Value of a Masters Degree (MDEG)- In light of a clear trend by other professional organizations (representing nursing, civil engineering and social work) toward requiring a Masters degree, at least for their top level credentials, this too was a consideration. It is worth noting that of all the credentials evaluated, only AACE's Certified Portfolio, Program and Project Manager (C3PM) required a Masters Degree, or equivalent educational credits, to earn the certification.

As was done with the BDEG, a web based review was conducted. Using the Master of Science degrees in project management from George Washington University, Boston College, Western Carolina and Stevens Institute of Technology, this yielded an AVERAGE of 36 credit hours required to earn the Master of Science degree. MDEG was broken down further into two components:

Actual Time Spent In Class- To calculate this value, it was assumed that each 3 credit hours awarded requires 40 hours of face to face classroom time. Thus for a 36 hour degree program: 36/3 = 12 individual courses X 40 hours per 3 credit course = 480 hours spent in a classroom.

Level of Effort expected by the student OUTSIDE of class time- Following the same approach, it was generally agreed that at the graduate level, for each hour spent in the classroom, a MINIMUM of 3 hours needed to be spent by a student in doing homework, writing research papers, working on group projects and taking exams: 480 classroom hours X 3 = 1,440 hours of effort outside of class.

Thus to calculate the MDEG (Standardized Value of a Masters Degree) we add the class time to the out of class time: 480 + 1,440 = 1,920 hours of learning experience.
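A minimal sketch of the BDEG and MDEG calculations, assuming 40 in-class hours per 3-credit course and the stated outside-study multipliers (2x at the undergraduate level, 3x at the graduate level); `degree_hours` is a name of my own choosing:

```python
# Standardized degree value: in-class hours (40 per 3-credit course) plus an
# outside-study multiplier (2x for undergraduate, 3x for graduate work).
def degree_hours(credit_hours, outside_multiplier):
    in_class = credit_hours / 3 * 40
    return round(in_class * (1 + outside_multiplier))


BDEG = degree_hours(125, 2)  # Bachelors: ~125 credits, 2 outside hours per class hour
MDEG = degree_hours(36, 3)   # Masters: 36 credits, 3 outside hours per class hour
print(BDEG, MDEG)  # 5000 1920
```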

The next category was any additional hours of REQUIRED training in order to sit for the exam (ARTH)- As PMI is the only organization to REQUIRE training prior to taking the exam, considerable debate ensued about how best to score this. Because the 35 hours of training can be fulfilled by studying books of sample exam questions or listening to podcasts, and does NOT require rigor equivalent to an undergraduate degree course, those hours were counted at face value (35 hours required, 35 hours counted under ARTH), with no additional hours to cover outside or additional study. (See the EXAM calculations below for more)

The next component of the scoring model was how to score the exams (EXAM)- As done previously, there are two components to this-

Time spent actually taking the exam, which came from the information published on each organization's web pages and/or downloads for their respective exam, and;

Level of Effort spent by the individual in studying for and preparing to sit for the exam. To calculate this value, a general consensus was reached that for each hour of exam, 30 hours of preparatory time was required. (Admittedly, this was based almost entirely on PMP and CCC/E experiences.) Using the PMP exam as an example, the exam itself is 4 hours long, therefore 4 X 30 = 120 hours of total effort to prepare for and pass the exam. HOWEVER, at least in the case of PMI's PMP, part of that level of effort is the 35 hours of required course work, so to avoid double counting those hours, we deduct 35 from 120, yielding an EXAM score of 85 for the PMP.
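The EXAM scoring rule just described can be sketched as follows (the 30:1 preparation ratio is the paper's assumption; the function name is hypothetical):

```python
# EXAM score: assume 30 hours of preparation per hour of exam, then deduct
# any REQUIRED training hours (ARTH) so they are not counted twice.
PREP_HOURS_PER_EXAM_HOUR = 30


def exam_score(exam_hours, required_training_hours=0):
    total_effort = exam_hours * PREP_HOURS_PER_EXAM_HOUR
    return total_effort - required_training_hours


# Worked PMP example from the text: 4-hour exam, 35 required training hours
print(exam_score(4, 35))  # 85
```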

The final challenge was how to weight the Total Level of Effort required for the Assessment processes (ATCA)- Following the processes developed previously, we broke the ATCA score into two elements-

How many PAID/VOLUNTEER person hours were required to CONDUCT the assessments- This information was readily available on most of the sites. While nearly all assessments required between 2-5 hours of assessor time, the key variable was how many people were required to conduct an assessment. To calculate this value, we multiplied the number of PAID/VOLUNTEER assessors required X the average time it takes to conduct an assessment. Using the asapm credential roughly equivalent to IPMA Level B as an example: it uses 2 assessors conducting an interview for 2 hours. 2 people X 2 hours = 4 person hours of effort.

How many hours of PREPARATION time did it take a person being assessed to prepare the required documentation, submit it to the assessors and then participate in the assessment process? Lacking any empirical evidence, an ASSUMPTION was made that for every person hour of assessment required, the person being assessed would need to spend 10 hours preparing.
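A sketch of the ATCA calculation under the assumptions above; note that whether the assessor person-hours are themselves included in the ATCA total, in addition to the candidate's preparation time, is my own assumption, as the text does not say explicitly:

```python
# ATCA sketch: assessor person-hours (number of assessors x interview hours)
# plus an ASSUMED 10 hours of candidate preparation per assessor person-hour.
# Including the assessor hours in the total is an assumption of this sketch.
PREP_RATIO = 10  # candidate prep hours per assessor person-hour


def atca_score(num_assessors, hours_each):
    assessor_hours = num_assessors * hours_each
    candidate_prep = assessor_hours * PREP_RATIO
    return assessor_hours + candidate_prep


# asapm example from the text: 2 assessors x 2 hours = 4 person-hours,
# implying 40 hours of candidate preparation
print(atca_score(2, 2))  # 44
```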

Worth noting is that while all the competency based credentials required assessments, only two- AACE's C3PM and PMI's PgMP- required that the applicant being assessed submit 360 degree evaluations from supervisors, peers, subordinates and customers. AACE accomplished this by requiring a log book, similar to those maintained by commercial pilots or SCUBA divers, while PMI required that the applicant submit signed copies of the assessment surveys.

To summarize the scoring model let:

Total Hours of Work Experience for a person WITH a 4 year degree = WEXP[7]

Standardized Value of a 4 year Degree = BDEG[8]

Standardized Value of a Masters Degree = MDEG[9]

Additional REQUIRED Training Hours = ARTH[10]

Total Level of Effort to prepare for and take the exams = EXAM[11]

Total Level of Effort required to prepare for and be assessed = ATCA [12]

Total Level of Effort and Degree Requirements Professional Score = PSCOR

Then WEXP + BDEG + MDEG + ARTH + EXAM + ATCA = PSCOR, where PSCOR is the cumulative calculated value of all the variables.

To help readers understand the scoring model, I will illustrate how two of the credentials were scored: PMI's PMP, as it is the most ubiquitous, and the top ranked C3PM from AACE, with the understanding that exactly the same set of calculations was performed for the other credentials as well.

PMI's PMP Relative Professional Score Calculations (See attached Excel spreadsheet[16], Sheet 2, Column R)

WEXP = 4500

BDEG = 5000

MDEG = 0

ARTH = 35

EXAM = 89

ATCA = 0

PSCOR = 9624

AACE's C3PM Relative Professional Score Calculations (See Excel spreadsheet[16], Sheet 2, Column B)

WEXP = 16000

BDEG = 5000

MDEG = 1920

ARTH = 0

EXAM = 434

ATCA = 132

PSCOR = 23486
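Since PSCOR is a simple sum of the six components, the two published examples can be checked directly (a Python sketch with my own variable names):

```python
# PSCOR is the simple sum of the six component scores.
def pscor(components):
    return sum(components.values())


pmp = {"WEXP": 4500, "BDEG": 5000, "MDEG": 0, "ARTH": 35, "EXAM": 89, "ATCA": 0}
c3pm = {"WEXP": 16000, "BDEG": 5000, "MDEG": 1920, "ARTH": 0, "EXAM": 434, "ATCA": 132}

print(pscor(pmp))   # 9624
print(pscor(c3pm))  # 23486
```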

This same formula was applied equally to all the credentials. (see attached Excel spreadsheet[16], sheet 2)

Now, is this model perfect? No, of course not, nor was it expected to be. To reiterate, it represents a first attempt at an independent scoring model that enables consumers and organizations alike to evaluate the various credentials against the alternatives, and that helps organizations wishing to add new credentials decide how to position them to fill niches and continually improve existing ones.

RESULTS

Having tried to be realistic in defining the expectations of what this model can or does show and what it does not or cannot show, the graph below illustrates the rank ordered results of this research.

As can be seen in the graph, due to NO work experience or educational requirements, combined with what appears to be a relatively weak testing/assessment process, the PRINCE2 credentials score significantly lower than all other credentials. At the other extreme, AACE's top credentials, the C3PM and the CFCC, scored very high, based on their requirements for high levels of demonstrated experience, a Master's degree (or equivalency in terms of education) and extensive peer review of papers, work outputs and other indicators of COMPETENCY.

Applying the "sniff test" (do the numbers make sense?), and once again relying on first-hand experience with both PMI's PMP and AACE's CCC/E: I know from taking and passing both exams, as well as from 20+ years of experience helping others prepare for them, that the level of preparation effort to earn the CCC/E (~240 hours of study time) is roughly double that of the PMP (~120 hours of study time). Yet the PSCOR for the CCC/E (13261) is only 138% of the PSCOR for the PMP (9624), against the roughly 200% one would expect, which indicates that the model still needs fine tuning in this area. Either the lower scoring credentials are being over-rated or the higher scoring credentials are being under-rated. This clearly needs more research.

Another "sniff test"- There seems to be a general consensus that the PMP (PSCOR = 9624) is higher than asapm/IPMA's Level D (PSCOR = 6132) but lower than asapm/IPMA Level C (PSCOR = 11106), and in this case the PSCOR scores reaffirm what many have been stating intuitively but have had no empirical proof to support. This would put PMI's PMP a little over half way between IPMA Level D and IPMA Level C. While this debate has raged for many years now, this model, or some refinement of it, will help quantify exactly how these credentials relate to one another.

The last of the "sniff tests" alerting us that the scoring model needs more refinement: the PMP, with a PSCOR of 9624, compared against the PRINCE2 Practitioner (P2P), with a PSCOR of only 78, also indicates the need for much more research on how to fairly and accurately score the various credentials. Knowing the PRINCE2 credentials to be credible, at least based on market acceptance, is this model being fair? And if not, what is the right or best model, the one which will provide the most accurate assessment?

There is one additional factor which is worth introducing as part of this research. In Malcolm Gladwell's "Outliers"[13], he asserts pretty convincingly that it takes 10,000 hours of honest, dedicated effort to become a "top ranked professional" at anything. (Sports heroes; musicians, such as the Beatles; artists; computer programmers, such as Bill Gates and Bill Joy; and, one would hope, project managers?)

Consistent with Gladwell's "10,000 hour" baseline, I have drawn a line across the graph below indicating which of the credentials meet or exceed that criterion and which ones do not.

As can be seen, the IPMA C and the AIPM CPPM both just meet Gladwell's 10,000 hour cut off, but perhaps more importantly, PMI's PMP, the most ubiquitous credential, just misses meeting this "superior performer" threshold. While all three are close, it would behoove all the organizations, especially OGC/APM but also PMI, to reconsider their certification requirements in light of Gladwell's research and perhaps reconfigure their certification programs to be more challenging.

GRAPH 1- Rank Order of Certifications Based on the Cumulative Score of Experience and Assessment


Also noteworthy is that PMI's PgMP does meet the 10,000 hour baseline, which would, at least according to Gladwell, qualify it as a credential appropriate for "superior" practitioners to strive for (along with, of course, the even higher scoring INCOSE, IPMA Level A, AIPM CPPD, and AACE CFCC and C3PM credentials).

Some other observations worth exploring in more detail-

It is interesting to see the clustering between the AACE family of certifications and those of INCOSE, as both organizations are heavily influenced by engineers. Follow-on research could uncover what, if any, differences an engineering perspective might bring to the process of creating professional level credentials. (Keeping in mind that engineering IS recognized as a profession, while project management is not, at least according to published research by Zwerman, Thomas et al. and myself.)

Also worth noting is that the top ranked credentials are NOT coming from PMI, which is without question the largest and most influential of the professional organizations purporting to represent practitioners of project management. Instead they are dominated by much less well known organizations: AACE's Certified Portfolio, Program and Project Manager (C3PM) and Certified Forensic Claims Consultant (CFCC), followed by AIPM's Certified Practicing Project Director (CPPD) and asapm's Project Director and Portfolio Manager levels. (Worth keeping in mind: the IPMA Level A credentials, represented by the asapm -AD, -AF and -AG, remained under development at the time of writing, and the evaluation criteria used in ranking them were preliminary and subject to change.)

The fact that the largest and most powerful organization representing the practice of project management appears not to produce the top ranked credentials has, or should have, important implications for individuals considering obtaining these credentials, and for companies interested in specifying which credentials are equivalent (important for transportability or mutual recognition) or are required for a particular job specification. (See recommendations for more on this topic)

ANOMALIES OR OBSERVED DISCREPANCIES IN THE MODEL

An example of one concern which was hard to evaluate or account for without further and much more refined research is the anomaly between the asapm/IPMA C and B levels. While the hours of experience remain the same, the DIFFICULTY of the projects managed increases, and that increasing level of difficulty is not captured in this model. (Theoretically, it would be captured during the assessment process.) Also, the asapm/IPMA C requires BOTH an exam and an assessment, while the asapm/IPMA Level B does not, which results in the IPMA C (PSCOR of 11106) actually scoring HIGHER than the asapm/IPMA Level B (PSCOR of 11033), even though the B is the higher level credential.

A related issue with the asapm/IPMA C is that its exam requires both multiple choice AND written short answer questions. We find a similar issue with AACE's Certified Cost Engineer credential (CCC/E). Not only does it include multiple choice questions requiring fairly extensive calculations to derive the right answer, but one quarter of the value of all AACE exam based credentials comes from a narrative or short essay report. Are written narrative or essay type answers more difficult than multiple choice questions? While we may believe intuitively that they are, the model presented here is not fine grained enough to pick up those nuances.

Another challenge faced in creating a relative scoring model was considerable inconsistency in how the value of a 4 year degree was calculated by the various organizations.

Again, based on first-hand knowledge of both PMI and AACE certifications, I will use them as examples. PMI requires 7,500 hours of experience for anyone with less than a 4 year degree and 4,500 hours of experience for a person WITH a 4 year degree. Simple subtraction yields the value of a 4 year degree according to PMI: 7,500 - 4,500 = 3,000 hours of work experience. Compare this to AACE. AACE requires 8 years of full time experience, which means 16,000 hours, assuming full time work of 40 hours per week X 50 weeks per year[14], and AACE allows 4 years of the 8 year requirement to be fulfilled by a 4 year degree. Performing the same calculation: 16,000 - 8,000 = 8,000 hours equivalent value for a 4 year degree for AACE. How then to keep this wide variation- 3,000 equivalent working hours for a Bachelors degree for PMI vs 8,000 hours accepted by AACE- from skewing the results?
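The implied work-hour value of a degree, as computed above for PMI and AACE, is a one-line subtraction (sketch; the function name is my own):

```python
# Implied work-hour value of a 4-year degree: experience required WITHOUT a
# degree minus experience required WITH one.
def implied_degree_value(hours_without_degree, hours_with_degree):
    return hours_without_degree - hours_with_degree


print(implied_degree_value(7500, 4500))   # PMI: 3000 hours
print(implied_degree_value(16000, 8000))  # AACE: 8000 hours
```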

To mitigate this extremely large discrepancy in how the different organizations calculate or determine the value of a degree, we chose to create a STANDARDIZED value for a degree and to apply it in all cases where a degree was required. (Note: not all certifications required a degree.) See the previous calculations: the Bachelors degree (BDEG) was equated to 5,000 hours based on the Level of Effort, while the Masters Degree (MDEG) was set at 1,920 hours.

Now is it fair or appropriate to equate an hour spent in University to an hour spent working? Probably not, but what is or should be the relative weighting?

Lastly, if we take the PSCOR for all credentials and compare it against just the level of effort for all credentials, we find that the level of effort comprises, on average, 85%, while the testing and assessment process comprises only 15% of this scoring model. As asked above, does this make sense? Should the testing and assessment process count for more? Intuitively, it seems as though the answer should be yes, but that too will have to be the subject of follow-on research.

Along with that question, we have to ask whether the academic and experience components are being weighted too high, or whether the testing and assessment processes need to be more robust to increase their weighting. All questions pointing to the need for more research.

LIMITATIONS TO THIS RESEARCH and OPPORTUNITIES FOR FOLLOW ON RESEARCH

A common theme throughout this paper is that it was a very preliminary, experimental research approach, and the model created is not without its weaknesses. Every attempt was made to identify those weaknesses, with the hope and expectation that this paper will serve as a catalyst, igniting interest in further study of this topic. To summarize the limitations of this model and to set the stage for further research, the key topics are listed below:

1) RELATIVE DIFFICULTY OF THE EXAMS- It is impossible to compare the relative difficulty of the different exams without being able to access at least a representative sample of questions from each. Based on first-hand experience, having taken both the PMP and CCE exams, I can say with absolute certainty that the AACE exams are far more difficult than the PMP, but unless one has taken all the certification exams, it is impossible to make a fair comparison. There needs to be some way of accessing, comparing and rating the relative difficulty levels of the exams.

2) PASSING SCORES- On one hand, we have the PRINCE2 credentials, which require only a 50% to 55% grade to pass, while the AACE exams require a 70% score on the problem solving sections, with the narrative portion graded on a pass/fail basis. In between, we have the ever popular PMP, which, at least until a year or so ago, required only 106/175 questions correct, or 60.6%, to pass. Once again, as a professional, I have a hard time justifying hiring anyone who scored less than 70% on a standardized exam, and there is no data available to see if the passing grade is consistent with the difficulty of the exam. There needs to be some consistent, INDEPENDENT approach that can match the passing grade with the difficulty of the exam questions, not only internally but between one organization and another.

3) DIFFERENT TYPES OF QUESTIONS- As noted above, both the AACE and asapm exams consist of both multiple choice and narrative style questions, while the PMP consists only of multiple choice questions. As written communication is part and parcel of being a project or program manager, I question the use of multiple-choice-only credentials. And based on first-hand experience with the PMP exam, I question whether the tricky English is really testing for project management knowledge or is a de facto TOEFL test. What is needed is a sound psychometric test which mixes multiple choice, essay, matching and fill in the blank questions, and which consistently measures differing levels of knowledge.

4) TOTAL NUMBER OF QUESTIONS and TIME LIMITS- Another cause for concern is the total number of questions and the average time allowed for answering each question. PRINCE2 Foundation consists of 75 questions with a 1 hour time limit (0.8 minutes per question), while AACE's Certified Cost Engineer (CCE) credential consists of 84 questions spread over 7 hours (5 minutes per question). Compare these against the PMP, which allows 4 hours to complete 200 questions (1.2 minutes per question). Surely with this kind of spread, not all these written exams can be equal in difficulty, depth or breadth of coverage. Here again, independent research needs to be done to establish standards which can be applied or adopted across the various professional organizations.
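The per-question time figures quoted above follow from simple division:

```python
# Average minutes allowed per question for each exam format quoted above.
def minutes_per_question(total_minutes, num_questions):
    return round(total_minutes / num_questions, 1)


print(minutes_per_question(60, 75))       # PRINCE2 Foundation: 0.8
print(minutes_per_question(7 * 60, 84))   # AACE CCE: 5.0
print(minutes_per_question(4 * 60, 200))  # PMP: 1.2
```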

5) CALCULATING THE VALUE OF A BACHELOR'S DEGREE– As noted earlier, there is a huge discrepancy in how the value of a degree is calculated in terms of level of effort. I will address this under the recommendations, but suffice it to say, it is in the best interests of all professional organizations to be more conservative, or at least more realistic, in calculating the value of a degree vis-a-vis work experience.

6) VALIDATING WORK EXPERIENCE– Here again, practice varies widely among the professional organizations. PMI does not validate whether the work experience claimed was on successful projects. It does audit 10% of applicants, but only verifies that the hours claimed were correct, not whether the project was a success or failure or whether the work done by the applicant was satisfactory. With the exception of its C3PM credential, which requires that a logbook of work experience be submitted, AACE, like PMI, does not actually verify the work experience. One of the recommendations below suggests this is a weakness all organizations should address in a more proactive and aggressive manner.

7) DETERMINING THE IDEAL RATIO OF TESTING AND ASSESSMENT vs EDUCATION AND EXPERIENCE– This too is an area which needs much more rigorous research and assessment. As all of us want to raise the professional image of the practice of project and program management, we need to compare the ratio of experience and education required of project managers against that required of doctors, lawyers, commercial airline pilots and accountants, to see if project management is consistent.

CONCLUSIONS and RECOMMENDATIONS

I hope it was and remains clear to everyone that creating this preliminary model was an EXPERIMENT, originally designed to:

1) see if it was feasible to produce a meaningful ratio scale against which to rank order and compare the relative standings of the various credentials from published information by each professional organization, readily available from their respective websites, and;

2) generate sufficient interest for others to carry this research forward in a more academically sound and rigorous manner.

I think the model, while admittedly containing anomalies, established that the RIGHT information published by the various professional organizations COULD be used as the basis for a toe-to-toe, nose-to-nose comparison. Having passed most, but not all, of the "sniff tests", the results from THIS model seem generally reliable, at least in the context of the four variables: experience, education, testing and assessment. Stated another way, while it works for most credentials, the model needs further research and refinement in order to be truly representative of "the truth, the whole truth and nothing but the truth" as it pertains to the relative standing of the various credentials vis-a-vis one another.

This brings up the second question: has this article attracted enough attention and generated sufficient interest for researchers to be willing to invest the time and effort, and for the professional organizations to be willing to fund such research and open their testing and assessment processes to independent analysis?

I think practitioners, and those who hire our skills and services, have the RIGHT to be able to evaluate the various credentials based not on marketing hype or urban legends, but on sound, academically rigorous comparison. For that reason, I am issuing a challenge to other practitioners and academics to join with me in pressuring the professional organizations to open up their credentialing processes in a transparent manner, enabling a "Truth in Credentialing" evaluation, so we can see what a credential actually tests for and what it represents in terms of producing COMPETENT practitioners[15].

So what are the recommendations deriving from this piece of preliminary research?

1) For those organizations or individuals who feel their credential has been diminished by, or are otherwise unhappy with, the results of this research: instead of "shooting the messenger", why not accept the challenge, either by creating a model you feel will more accurately score ALL credentials, or by increasing the PSCOR score of your organization's credential by raising the requirements: WEXP, BDEG, MDEG, ARTH, EXAM and ATCA. I suspect there is an optimum mix of these variables which will produce the most competent practitioners in the shortest time at the least cost.

2) Taking their cue from the state Supreme Court findings in North Dakota and Florida, all professional organizations should seriously consider dropping "experience in lieu of a degree", at least for their mid and top level credentials. In today's highly competitive, global environment, nearly all "professional" positions require, at minimum, a 4 year degree.

3) Consistent with the professionalization of other occupations such as nursing, civil engineering and social work, all professional organizations should start to consider requiring a Master's degree for their top level credentials. (This is consistent with the hiring practices for professional consultants at organizations such as the World Bank, the UN Projects Office, and agencies and NGOs such as USAID and AusAID.)

4) Consistent with the rather stringent consumer protection laws in most countries today, and given that at least some of these credentials are being used as requirements (de facto licenses) to screen potential employees, all professional organizations should be willing to open their exam databases to suitably qualified researchers to develop an INDEPENDENT measure of what the exams actually DO measure. (Knowledge? Skills? Attitudes? Competencies?) That information should then be published on their exam websites or in .pdf downloads, so those seeking the credentials, and those hiring the people who hold them, know what they can and should reasonably expect. As equivalency, transportability and reciprocity are important issues globally, it is essential that a meaningful, INDEPENDENT comparison of the various credentials be developed and published.

5) As evidence grows that at least some of the more mature organizations are becoming disillusioned with knowledge based credentials and moving towards competency based credentials, all professional organizations interested in seeing the credibility and market acceptance of their certifications grow should be taking steps to ensure that work experience is validated: not only auditing to confirm that the hours worked are accurate, but also that the work was done in a professional manner, and that the work being claimed was either on "successful" projects or, if the project failed, that the "lessons learned" were captured and communicated. Logbooks and 360 degree or Balanced Scorecard type evaluations should become part and parcel of all certifications.

6) IF practitioners want to improve the professional image of the practice of project management, a formal internship/mentorship program should be incorporated as a requirement. Following the example of the engineering profession, after the CAPM, CCT, CPPP, IPMA Level D etc., but BEFORE being allowed to qualify for the PMP, CCC/E, CPPM or IPMA Level C, there should be a formal Career Path Development Program in place, with the work being logged and a 360 degree peer review required. (As with PMI's PgMP and AACE's C3PM.)

To conclude, if anyone is interested in this topic for a Master's thesis or PhD dissertation, I would be willing to serve as an advisor or supervisor in conducting a more rigorous, academically defensible study.

[1] http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers&discussionID=7074372&gid=35313&commentID=9547495&trk=view_disc

[2] www.globalpmstandards.org

[3] Giammalvo, Paul D., "Is Project Management a Profession? And if not, what is it?" PhD dissertation, 2007

[4] Pierce v. AALL Insurance Inc. [513 So.2d 160, 161 (Fla. 5th DCA)]

[5] Garden v Frier [602 So.2d 1273 (FL 1992)]

[6] Jilek v. Berger Electric (441 N.W.2d 660 [ND 1989])

[7] This value was taken from the published requirements on the various certification websites and/or downloadable .pdf files.

[8] For the purposes of this experiment, the assumptions used in calculating BDEG were:

  1. The average project management undergrad degree required 130 credit hours for graduation;
  2. That for each 3 credit hours, 40 hours of class time was required;
  3. That for each 40 hours of class time, 2 hours of homework, research, writing or outside work was required by the student.

[9] For the purposes of this experiment, the assumptions used in calculating MDEG were:

  1. The average project management graduate degree required 36 credit hours for graduation;
  2. That for each 3 credit hours, 40 hours of class time was required;
  3. That for each 40 hours of class time, 3 hours of homework, research, writing or outside work was required by the student.
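Read literally, the assumptions in footnotes [8] and [9] translate into the following level-of-effort estimates. This is a sketch only: the homework ratio in particular is ambiguous in the footnotes, and here it is taken exactly as written (2 or 3 hours of outside work per 40 hours of class time), so the function name and parameters are illustrative rather than the author's actual formula:

```python
def degree_effort_hours(credit_hours, class_hours_per_3_credits=40,
                        homework_per_40_class_hours=2):
    """Level-of-effort estimate, reading footnotes [8] and [9] literally."""
    class_hours = credit_hours / 3 * class_hours_per_3_credits
    homework_hours = class_hours / 40 * homework_per_40_class_hours
    return class_hours + homework_hours

bdeg = degree_effort_hours(130, homework_per_40_class_hours=2)  # footnote [8]
mdeg = degree_effort_hours(36,  homework_per_40_class_hours=3)  # footnote [9]
print(f"BDEG: {bdeg:.0f} hours, MDEG: {mdeg:.0f} hours")
```

Under this literal reading, a bachelor's degree comes to roughly 1,820 hours and a master's to roughly 516 hours of effort; if the author instead intended the more conventional "2 hours of outside work per hour of class", the figures would be substantially higher, which is precisely why these assumptions merit further scrutiny.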

[10] As only PMI REQUIRES training prior to taking the PMP exam, and because that training can be fulfilled by simply studying books of sample questions or listening to a podcast, I did not count it as equal to academic course work and counted the hours only, with no outside or additional effort. (See exam prep effort below.)

[11] Based on inputs received from several sources and on firsthand experience, I assumed 30 hours of preparation for each hour of exam. For the PMP only, I deducted the required 35 hours of training from the total (4 x 30 = 120; 120 - 35 = 85 hours).

[12] To calculate the

[13] Gladwell, Malcolm, "Outliers", 2008, Penguin Press, Chapter 2, pages 38-76

[14] For the purposes of calculation, as project managers rarely work a "standard" 40 hour week, it was assumed that the hours of overtime offset the sick days, holidays and any non-project based work, which also may or may not be true and requires further research.

[15] Competent is defined by Merriam-Webster's dictionary as "the quality or state of being functionally adequate, characterized by marked or sufficient aptitude + attitude + skills + strength + knowledge"

[16] Source and supporting Excel and Word artifacts
