This report presents the results of a survey of the engineering analysis and simulation industry to determine its views on the competency framework and competency-based registered analyst scheme.
A major goal of the EASIT2 project was to contribute to the competitiveness and quality of engineering design and manufacturing in Europe by identifying the generic competencies that users of engineering analysis and simulation systems must possess. This competency framework includes a comprehensive Educational Base, a web-based interface and the ability to interface with other staff development systems, with links to associated resource material that engineers and analysts can use to develop and track their competencies. The project also delivered an integrated Registered Analyst (RA) Scheme to provide recognition of achievement of these competencies. To help ensure that the project's deliverables enjoyed as much industry uptake as possible, a survey of industry needs was clearly essential.
The survey itself comprised an online questionnaire of 34 questions taking about 15 minutes to complete. It was completed by 1094 respondents from 50 different countries. About 28,000 invitations were sent out across a broad range of industry sectors in all EU countries, using the NAFEMS and partner contact databases. Approximately 3% of those receiving invitations completed the questionnaire, a typical response rate for such surveys, and the volume of responses indicates strong interest in the subject. All the targets set at the start of the survey, in terms of overall response rate, industry sector, company size and seniority, were achieved. The margin of error for the survey results was estimated to be ±3%.
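The reported ±3% figure is consistent with the standard margin-of-error formula for a proportion at 95% confidence, using the conservative worst case p = 0.5. A minimal sketch of that check (the function name is illustrative, not from the report):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for an estimated proportion p from a sample of
    size n, at the confidence level implied by z (1.96 -> 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# With the survey's 1094 respondents:
moe = margin_of_error(1094)
print(f"{moe:.1%}")  # prints 3.0%
```

Note that this assumes simple random sampling; the self-selected nature of an online survey means the true uncertainty may be larger.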
The majority of respondents were engineers/analysts and senior engineers, although Project Managers and Directors were also well represented. About half the respondents were educated to Masters degree level, and about a third to Doctorate level. Respondents were well distributed across all age groups and generally had considerable experience in computer-based engineering analysis.
The largest organisations (500+ employees) had a greater proportion of near full-time analysts, while smaller companies and small to medium-sized enterprises (SMEs) had a higher proportion of part-time analysts. This suggests that engineering analysts in small organisations face a greater challenge in developing their engineering analysis skills because they often need to multi-task.
When asked how their formal education related to their engineering analysis activity, respondents with higher education levels generally reported a closer relationship. Nevertheless, even among Doctorate holders, fewer than a third felt that their formal education related fully to their engineering analysis activity. Since a large majority of engineers are performing analysis tasks not fully covered by their formal education, there is clearly a need for lifelong learning in engineering analysis at all education levels.
Half of the respondents worked at large organisations (500+ employees), while a significant proportion (20%) worked at very small organisations of 1-20 employees. Overall, 40% of respondents worked in SMEs, so both large organisations and SMEs were well represented in the survey. In general, the larger the organisation, the larger the number of analysts. However, the number of analysts in respondents' organisations varied widely: a significant number of organisations (7%) had only a single analyst, while about a quarter had 100+ analysts. This wide distribution suggests that staff development in engineering analysis requires both a personal approach and a company-wide approach.
The industrial sectors of respondents' organisations were well distributed among the 9 categories (Energy, Aerospace, Land Transport, Civil & Construction, Consumer Goods, Marine & Offshore, General Industrial Goods, Petrochemical & Process and Defence), with each accounting for between 6% and 15% of all responses. Many organisations' activities included Manufacturing (20%), Design/consultancy (27%) and Research and development (28%).
In terms of barriers to the use of computer-based engineering analysis, the highest rated were “recruitment” and “lack of analysis skills”, clearly indicating a need for an increase in the pool of competent engineering analysts and improved lifelong learning.
Respondents also ranked reasons why their organisation fails to get the most out of engineering analysis and simulation software, with "pressure of work" ranked highest. This shows that lifelong learning needs to fit around analysts' workloads rather than add to them. "No convenient external training" was also ranked highly in small countries and in small organisations.
Over half of respondents stated that the competences needed to perform analysis tasks in their organisation are not formally defined and a significant majority (70%) stated that they had no system to record analyst skills. This shows that there is a significant need for the deliverables of the EASIT2 project. The proportions were even higher in SMEs and organisations with small analysis departments, demonstrating that they are in the greatest need of support in terms of defining and recording analyst competences, perhaps because they lack the resources to do this themselves.
When asked whether a system that defines analyst competences and provides links to appropriate training resources would be useful, a large majority (81%) responded “Yes” which is a resounding vote in favour of the main objective of the EASIT2 project.
The most favoured media for a competency framework were a company intranet (particularly among larger organisations) and a secure website; a company intranet was also the most common medium in existing systems. For assessing attainment of analyst competences, assessment by manager/mentor was rated as the most useful, and was also the most common method in existing systems. The second most useful was "self-assessment", closely followed in third place by "on-line/computer-based test". A three-level system for rating analyst competencies was preferred by nearly half of respondents, followed by four, then five levels.
Respondents ranked the importance of a list of analysis areas, with Finite Element Analysis ranked highest. The top 7 analysis areas already existed in the CCOPPS Educational Base for the analysis of pressure vessels; these were modified and enhanced with additional competences in the generic Educational Base of the EASIT2 project, while the others were developed from scratch.
There was strong support for a professional qualification in engineering analysis, and there appears to be potential to adopt such a qualification as a legal requirement in many activities. Respondents felt that it would be equally useful for each of the following reasons: Incentive for staff development, Recruitment, Subcontractor qualification, Internal resource management and Marketing. Marketing was rated significantly more highly by SMEs and lone-analyst organisations than by larger organisations.
The preferred assessment methods for a professional qualification were a Professional Interview or assessment by a Manager/Mentor. There were significant national differences, however: the UK preferred Professional Interview and External Assessment of submitted work, while the USA preferred External Examination. Perhaps not surprisingly, these are also the methods used for professional engineer assessment in those countries. A fee of up to €200 for such an assessment appeared acceptable to about half the respondents.
In conclusion, the survey has confirmed the timeliness and objectives of the EASIT2 project. The findings of the survey were used to ensure that the project deliverables met industry needs as fully as possible.