Need help getting started?
Talk to us today about how Insync can help your organisation measure and improve employee engagement, customer engagement and organisational capability.
Unlock evidence-based insights into board performance. We’ve spent over 25 years understanding what success means to organisations like yours.
We are the catalyst for and facilitator of meaningful change, drawing on our suite of employee experience, customer experience, and board and governance frameworks and tools, decades of industry experience, and our foundation in psychology, research and business consulting.
Insync differentiates through our:
In-house surveys have many limitations, including:
Well-designed survey statements help you achieve your survey objectives regardless of the issues your organisation faces. Asking the right questions will help uncover problem areas and confirm your gut feelings. This knowledge can be used for decision making to increase your organisation’s performance and profit.
With a 25+ year history, we’ve provided stakeholder surveys to some of the largest organisations in the Asia Pacific. Read more about our experience here.
Yes. We have the tools and partners to develop surveys in any language. We can collect survey responses worldwide and around the clock, via paper and online forms. We’ve conducted over 1,000 employee, customer and board research projects in the last five years.
We’ve done surveys in over 40 languages in around 100 countries. Languages we’ve developed surveys in include: French, Arabic, Polish, Hungarian, German, Italian, Spanish, Portuguese, Mandarin, Hindi, Tagalog, Korean, Japanese, Vietnamese and more.
Yes. Once your organisation has established a baseline, we can track the progress of your organisation’s results against previous years’.
If you’ve previously used a survey from another provider, we can incorporate some of its questions into the current survey to draw historical comparisons, provided the previous survey used a rating scale comparable with Insync’s survey rating system.
Yes. We can customise our surveys to better meet your organisation’s needs. Examples of what you can do include:
Insync recommends using a seven-point scale. Seven-point scales are used because they have a clear middle point and there are just two choices between the middle and end points. This is ideal for capturing variations in opinions without presenting too many choices (leading to indecisiveness) or too few (meaning that variation in response is lost).
Most people would regard an average response of 5 on a 7-point survey response scale as good. Comparing that average against an external benchmark, however, may tell a different story.
For example, survey statements with an average response of 5 can have a benchmark ranking as high as 90 out of 100 (i.e. a very high ranking compared to other organisations), or as low as 15 out of 100 (i.e. a very low ranking compared with other organisations).
Put another way, an organisation that considered a survey statement with an average response of 5 to be good would be drawing the wrong conclusion if the majority of other organisations responded to the same statement with an average of, say, 6.5 on a 7-point scale.
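To illustrate this point, here is a minimal Python sketch of how the same average score can rank very differently depending on the benchmark pool it is compared against. The benchmark figures and the ranking function are invented for illustration, not Insync's actual methodology:

```python
# Hypothetical illustration: the same average score can rank very
# differently depending on the benchmark distribution it is compared to.
def benchmark_rank(score, benchmark_averages):
    """Percentage of benchmark organisations this score exceeds (0-100)."""
    below = sum(1 for b in benchmark_averages if b < score)
    return round(100 * below / len(benchmark_averages))

# Two invented benchmark pools of organisation averages on a 7-point scale.
tough_pool = [5.8, 6.1, 6.3, 6.5, 6.6, 5.9, 6.4, 6.2, 6.0, 6.5]
easy_pool = [3.9, 4.2, 4.5, 4.1, 4.8, 4.3, 4.6, 4.0, 4.4, 4.7]

print(benchmark_rank(5.0, tough_pool))  # 0: every peer scores higher
print(benchmark_rank(5.0, easy_pool))   # 100: every peer scores lower
```

The same average of 5 lands at the bottom of one pool and the top of the other, which is why a raw score is hard to interpret without a benchmark.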
Benchmarking your organisation’s responses against similar organisations gives you an internal baseline for your performance and shows where your organisation sits externally compared to others. This reveals the areas you need to focus on to hit your goals and get ahead of the competition.
For people who have varying levels of access to computers, email and telephone, Insync can administer surveys online, on paper forms, via kiosk, via SMS or mobile, by robocall, over the telephone and even face-to-face.
Depending on the format of the survey, there are different ways we distribute our surveys.
Online: Provide us with a list of survey participants’ email addresses and our secure system sends out all the survey invitations for you, including reminder emails for those who haven’t yet completed the survey.
Paper:
We’re flexible and open to your preferred method of distribution.
All of our online applications are designed to be used by multiple browser platforms, including current and previous versions of Safari, Chrome, Internet Explorer, Mozilla Firefox and Opera. We come across a diverse range of IT environments throughout our work with public, private, educational and not-for-profit organisations and have adapted our technology accordingly. Our surveys do not require respondents to install browser plug-ins or special software to complete a survey. We understand the IT security and management needs of our clients and their diverse stakeholders.
Strong levels of communication are essential to a successful employee or customer survey. Pre-survey communication should explain:
Our survey invitations will reinforce why you are undertaking the research and also have detailed instructions for completing the survey.
Organisations that achieve the best response rates do not rely on a single method of communication, but use multiple methods. Examples include posters, articles in newsletters or e-bulletins, staff briefings, and flyers or reminders in payslips.
Research projects that are supported by the CEO of an organisation are far more likely to be taken seriously by participants.
Prizes are also a good way to encourage participation and thank people for their help.
Post-survey communication is also very important. Respondents expect to be informed of the results and subsequent improvement initiatives. Most of our clients involve their CEO or very senior staff in their customer and staff debriefs. Insync’s research experts and registered psychologists can also attend and support all debriefs; this is particularly valuable to help your audience extract meaning from data.
It’s essential to communicate the results of an employee or customer survey, regardless of the outcomes of the survey. Failure to do so can lead to mistrust in an organisation’s leaders, negativity towards the survey process, and can undermine any follow-up actions. Survey results are typically reported back to survey participants via presentations, town hall meetings or through electronic or hard copy notifications.
Organisations that achieve the best traction with customers and employees following a survey tend to take a multi-tiered approach to responding to the results: feedback and initiatives are communicated to customers face-to-face, while for employees the feedback is cascaded down through a number of levels (e.g. whole of organisation, business unit, work team).
There are many ways to increase your survey response rate. Click here for our list of methods.
There are three main phases of a customer and employee survey project:
Pre-survey phase – we begin with a planning and requirements session. The survey is then customised to suit your needs. It’s ready for launch when the online and/or paper form design is approved and when your organisation has finalised survey communications.
Survey administration phase – participants are invited to complete the survey. Your Insync research project manager will monitor response rates to identify the need for follow up communications. A hotline is available for participants to call if they have any questions.
Post-survey phase – after we close the survey, Insync will analyse the survey responses, benchmark your results and compile a comprehensive report. The report shows results using tables, graphs and charts. It provides both a high level overview of the results and a detailed breakdown of participants’ responses.
A highly qualified research project manager will then conduct a full presentation of the results with your nominated customer group or management team.
Optional services – depending on the product, other support services at additional cost may include, but are not limited to: custom analyses, a results bulletin, management education sessions, focus groups and additional presentations.
Here is a summary of the three main phases of a customer and employee survey project and how long it generally takes:
Pre-survey takes approximately three to four weeks. This phase begins with a planning and requirements gathering session and incorporates all pre-survey communication.
Survey administration typically lasts for two to three weeks. Throughout this phase, research project managers work closely with the survey facilitator to provide updates on response rates and follow up actions to improve the response. In addition, a hotline is available for participants to call should they have any questions.
Post-survey involves Insync analysing the survey responses, benchmarking your results (if applicable) and compiling a comprehensive report. This report will be available usually within three to four weeks after the survey close. A research project manager will then present the results with the nominated customer group or management team.
A univariate rating scale involves a single quantitative rating for each survey statement. The most common type of univariate response assesses agreement with a particular survey statement. For example: to what extent do you agree with the statement that “senior leaders are respected”?
Univariate rating scales are used in the following Insync product: Employee Engagement Survey.
A bivariate rating scale, on the other hand, involves two quantitative ratings for each survey statement or question. The most common type of bivariate response assesses both “importance” and “performance” in relation to a survey statement. For example: how important is “financial accountability” to you, and how is the organisation performing in terms of “financial accountability”?
The difference between the importance and performance is called the “gap”. The gap measures the difference between the importance of an issue to an organisation and the performance of the organisation. If the performance of the organisation in relation to an issue is rated significantly lower than the importance, it signifies an area for improvement.
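The gap calculation itself is simple subtraction. A minimal sketch with hypothetical importance and performance ratings on a 7-point scale, treating a gap of 2.00 or more as significant (the threshold the glossary gives for a 1-7 bi-variate scale); the item names and figures are invented:

```python
# Hypothetical bivariate results on a 7-point scale: (importance, performance).
ratings = {
    "financial accountability": (6.5, 4.2),
    "board communication": (5.8, 5.5),
}

SIGNIFICANT_GAP = 2.0  # threshold treated as significant on a 1-7 scale

gaps = {}
for item, (importance, performance) in ratings.items():
    gap = round(importance - performance, 2)  # gap = importance - performance
    gaps[item] = (gap, gap >= SIGNIFICANT_GAP)
    print(item, gaps[item])
```

Here “financial accountability” shows a gap of 2.30 and is flagged as an improvement area, while “board communication” (gap 0.30) is not.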
Bivariate rating scales are used in some of the following Insync tools: Entry Interview and Exit Survey.
If your question isn’t answered here, please contact us or email your question to: info@insync.com.au
A leadership development survey where colleagues provide feedback so the participant’s emotional engagement and competencies can be improved in line with the organisation’s direction.
Translating survey findings (answers) into actions (improvement initiatives) in a structured way, resulting in steps to achieve goals for improving organisational performance.
Learn more about action planning
A secure and private online facility hosted by Insync to review the organisation’s survey results and improve them by committing to tasks.
Day-to-day employee operations that are in tune with the organisation’s direction and strategy.
The interpretation of raw survey data into meaningful insights which are used to inform actions. Statistical analysis applies mathematical models to assess the significance of findings and to gauge how strongly they are supported by other evidence, adding weight to the credibility of survey findings.
A reduction or decrease in the number of staff within an organisation. Organisations often want to understand why attrition rates are high. An exit interview tool captures departing employees’ opinions so the organisation can devise initiatives to reduce employee attrition and increase employee engagement to save money and/or boost productivity.
Survey data that can be used for comparison or reference purposes. The first time a survey is conducted will establish a baseline from which the organisation can build.
Results compared to a group of other organisations that have conducted the same survey in a recent timeframe. A benchmark may be composed of similar organisations’ results in the same industry, or all organisations in Insync’s benchmark database.
Compares two factors across the same survey items to generate a gap score. A bi-variate scale usually comprises “importance” and “performance” scales. It measures how “important” a survey concept is to respondents, then compares how the organisation is perceived to be “performing” against the same survey item. A gap score can then be measured. A significant gap between importance and performance signifies a potential improvement opportunity.
Allows certain questions to be asked based on the respondent’s response to the previous question. Branching within a survey allows different questions to be asked of various groups and can also simplify the layout of the survey for those completing it.
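Branching can be pictured as a small decision graph: each answer determines which question is shown next. A minimal sketch, in which every question ID and question text is hypothetical:

```python
# Minimal sketch of survey branching: the answer to one question
# determines which question is shown next. All IDs are hypothetical.
questions = {
    "q1": {"text": "Did you contact our support team this year?",
           "next": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "How satisfied were you with the support you received?",
           "next": {}},
    "q3": {"text": "How likely are you to recommend us?",
           "next": {}},
}

def next_question(current_id, answer):
    """Return the ID of the next question, or None if the branch ends."""
    return questions[current_id]["next"].get(answer)

print(next_question("q1", "yes"))  # q2 - support users get the follow-up
print(next_question("q1", "no"))   # q3 - others skip straight ahead
```

Only respondents who answer “yes” ever see the support-satisfaction question, which is how branching keeps the survey shorter for everyone else.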
This research method provides a script for the interviewer to follow during a telephone interview and answers that the respondent provides will then shape which questions come next. The interviews are documented in this system.
An integral part of conducting any survey is protecting the participants’ anonymity. Surveys should be designed with informed consent and confidentiality in mind. To this end, survey data is typically analysed based on aggregate responses.
The degree to which one can be confident that the data reflects the true opinion of the sample group and that results are not due to chance. Most research operates on a 95% confidence level.
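To make the 95% figure concrete, here is a sketch of the margin of error around a survey proportion. The document does not specify a calculation method; this assumes the common normal-approximation formula with z = 1.96 at the 95% level, and the sample values are hypothetical:

```python
import math

# Sketch: margin of error for a favourable-response proportion at the
# 95% confidence level, using the normal approximation (z = 1.96).
def margin_of_error(p, n, z=1.96):
    """p = observed proportion, n = number of respondents."""
    return z * math.sqrt(p * (1 - p) / n)

p, n = 0.70, 400  # hypothetical: 70% favourable from 400 respondents
moe = margin_of_error(p, n)
print(f"{p:.0%} ± {moe:.1%}")  # roughly 70% ± 4.5%
```

In other words, with 400 respondents a 70% result is likely (at the 95% level) to reflect a true value somewhere between about 65.5% and 74.5%.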
The set of shared beliefs, values, goals and practices that characterises an organisation.
A survey that is designed to meet the customer’s unique needs. Custom surveys can be developed when the client’s requirements fall outside the scope of our standard suite of products.
A survey sent to a selected segment of customers which measures perceptions, satisfaction and engagement with an organisation and their products and services. In addition it can make predictions about customers’ future purchasing behaviour and advocacy.
Learn more about customer surveys
Data which bears no names or any part of a response that would make an individual respondent easily identifiable.
The different groups of survey participants e.g. job level, gender, age range, etc. Demographics shape the degree of detail that survey data can generate.
The sub-categories of demographics, e.g. “female” would be a demographic class of the demographic “gender”.
This may be face-to-face, over the telephone or with video software. A “depth” is a structured conversation to inform specific research objectives e.g. to flesh out or validate survey findings.
The degree to which an organisation supports differences between employees based on culture, ethnicity, gender etc.
Learn more about gender diversity and inclusion in the workplace
Factors that can be influenced to have a positive effect on something else. For example, drivers of employee engagement improve engagement indirectly rather than directly, so targeting improvement initiatives at those drivers may help to boost employee engagement.
The extent to which employees engage in a personally meaningful, mentally active and productive manner at work, consisting of three inter-related components:
Learn more about employee engagement
An organisation’s ability to fulfil the physical, emotional, and psychological needs of its employees.
Surveys individuals in the workplace. The employee survey is generally non-compulsory and initiated for a number of reasons such as to:
Learn more about employee surveys
As opposed to an exit survey, an entry survey looks at incoming employees to measure their perception of induction and on-boarding processes and whether the organisation has met their expectations. Entry surveys are usually conducted at the three month mark and can be repeated at six or nine months.
Learn more about entry surveys
Standards for professional conduct which include obligations to protect and inform survey respondents of how the research will be used.
An interview which explores employees’ motivations for leaving an organisation. Exit interviews benefit organisations by:
Learn more about exit interviews
The underlying themes or areas which collectively indicate alignment with organisational goals. They may include senior leadership, long term direction, investment in systems, investment in people, etc.
A trained researcher who will run your workshop, action planning, focus group or depth interview.
A structured discussion amongst a group of individuals which is facilitated by a trained researcher (the “facilitator”) to inform specific research objectives.
Learn more about employee focus groups and customer focus groups
The day that a survey is launched and sent to potential respondents. Also see “Launch date”.
If survey respondents have already been assigned to particular demographics and related sub-classes, the demographics are hidden: respondents will not have to select a demographic on the survey.
An individual who answers the survey in an aggressive, rude or inappropriate manner.
Something that is offered in return for participation in a survey; examples may be financial, a charity donation or entry to a prize draw.
The period during which the survey is open to potential respondents. There is a limited time (typically 2-3 weeks) for individuals to respond to a survey; after that time the survey is closed and responses are no longer accepted.
KPIs are sometimes linked to survey results as a measure of performance for managers. This should be done with caution, as there are instances where survey findings are skewed among certain cohorts, reflecting a social desirability bias.
Another way of expressing the date that a survey is sent to potential respondents. Also see “Go-live”.
A psychometric scaling method, measuring positive or negative responses to a survey. A Likert scale may contain options where the respondent selects their level of agreement or disagreement.
The term used to describe the average. The mean is defined as the total of the scores divided by the number of scores.
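That definition translates directly into code. A tiny sketch with invented 7-point responses, showing the sum-divided-by-count arithmetic alongside the standard library equivalent:

```python
from statistics import mean

scores = [5, 6, 4, 7, 5, 6]  # hypothetical 7-point survey responses

# Mean = total of the scores divided by the number of scores.
print(sum(scores) / len(scores))  # 5.5
print(mean(scores))               # 5.5 - same result via the stdlib
```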
The defined process that the research follows which may determine things such as demographic selection, sample size, data collection.
Insync’s project management methodology based on four key principles that guide decision making and conflict resolution processes to ensure projects meet the required deadlines, scope and budget.
A question that has multiple responses available. Respondents can often select all that apply rather than just one answer.
A question that invites individuals to write their own responses rather than indicating level of agreement on a numeric scale.
The proportion of respondents who indicate positive agreement e.g. by responding with a 6 or 7 on a 1-7 point scale.
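A sketch of that calculation with invented ratings, counting 6s and 7s on a 1-7 scale as favourable:

```python
responses = [7, 6, 5, 4, 6, 7, 3, 6, 5, 7]  # hypothetical 1-7 ratings

# Favourable = responses of 6 or 7 on the 1-7 scale.
favourable = sum(1 for r in responses if r >= 6)
pct_favourable = 100 * favourable / len(responses)
print(f"{pct_favourable:.0f}% favourable")  # 60% favourable
```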
When a survey is conducted on a small scale, generally with the intention to test reactions or gather feedback, enabling the researchers to revise their strategy prior to wide-scale roll-out.
An electronic access point which enables permitted individuals to view, download and distribute research findings. This may incorporate an action planning portal.
Communication to potential survey participants to prepare them for the upcoming survey. These may include posters, bulletins, manager briefings, survey invitations, survey introduction letters, email sign-offs, collateral etc.
Provides all survey stakeholders with a visual representation of the tasks, responsibilities and timeframe of the entire project.
Open-ended questions are asked with the aim of understanding human behaviour and the reasons behind decision making. It generally focuses on fewer people than quantitative research, but provides deeper understanding via depth interviews, focus groups or written feedback.
A systematic process applied to all projects which involves thorough review of analysis to prevent errors.
Structured questions are asked and responses are limited to numerical/scaled responses. It is administered to a large number of respondents so that trends can be identified using statistical, mathematical or computational methods of analysis.
The points that divide ordered data into quarters, cutting off the lowest and highest 25% of the distribution. Any result above the 3rd quartile point is in the top 25% of results; anything below the 1st quartile point is in the bottom 25% of results.
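Quartile cut points can be computed with the standard library; note that different statistical conventions interpolate the cut points slightly differently (this uses Python's default "exclusive" method), and the scores below are invented:

```python
from statistics import quantiles

# Hypothetical organisation scores, already sorted, on a 7-point scale.
scores = [3.1, 4.0, 4.4, 4.8, 5.0, 5.3, 5.6, 5.9, 6.2, 6.6]

# quantiles(n=4) returns the three cut points dividing data into quarters.
q1, q2, q3 = quantiles(scores, n=4)
print(q1, q2, q3)  # a quarter of results fall below q1, a quarter above q3
```

A result above q3 would sit in the top quarter of this (invented) distribution; one below q1 in the bottom quarter.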
Where an organisation’s results fall relative to the benchmark database if the data was plotted on a standard distribution curve and divided into ordered quartiles.
Survey results in their original, unprocessed form. Raw data from qualitative research may constitute tape recordings of conversations or unedited written/typed responses to questions.
A technique for modelling and analysing several variables simultaneously, where the focus is to determine the relationship between a dependent variable and one or more independent variables.
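The simplest case is one independent variable. A least-squares sketch with entirely invented data, fitting a hypothetical overall-engagement score (dependent) against a leadership score (independent):

```python
# Hypothetical ratings: leadership score (independent variable) and
# overall engagement score (dependent variable) for five groups.
leadership = [4.0, 4.5, 5.0, 5.5, 6.0]
engagement = [4.2, 4.6, 5.1, 5.4, 6.1]

# Ordinary least squares for a single independent variable.
n = len(leadership)
mean_x = sum(leadership) / n
mean_y = sum(engagement) / n
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(leadership, engagement))
         / sum((x - mean_x) ** 2 for x in leadership))
intercept = mean_y - slope * mean_x
print(f"engagement ≈ {intercept:.2f} + {slope:.2f} × leadership")
```

The fitted slope describes how much the dependent variable moves, on average, for each one-point change in the independent variable.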
Responses from a statistically representative population drawn from all areas of an organisation.
An individual who has been invited to and participates in a survey.
An organisation’s ability to retain employees to maintain capability and productivity levels for optimum performance. Understanding why employees leave an organisation by conducting exit interviews is an effective way to manage retention levels.
The group of respondents who participated in a survey.
A survey in which respondents participate of their own free will.
In a bi-variate survey with a rating scale of 1-7, it refers to a gap score of 2.00 and above. See “bi-variate scale”.
Measures staff satisfaction to find out what can be done to improve organisational performance. Research shows high levels of staff satisfaction or employee engagement can lead to better customer experiences and improved productivity.
Learn more about staff surveys
This is a statement/question on a survey where respondents are typically asked to indicate their level of agreement or disagreement on a rating scale.
When a group of respondents have been surveyed too many times within a relatively short timeframe, resulting in a decreased response rate which affects robustness of survey data.
The individuals who you are aiming to survey.
The process of grouping open-ended feedback/qualitative responses together into common themes.
A typed copy of the response of a research participant, either edited (in which case it will be marked as an edited version) or unedited.
A survey item that employs scoring on only one scale such as level of agreement, as opposed to a bi-variate scale which may ask for a response by two scales such as “importance” and “performance”.
Comment/feedback that is relayed exactly as it was provided by the survey respondent.
Respondents have undertaken the survey of their own free will rather than it being mandatory.
The first paragraph within a survey that invites the participant to complete the survey, explains its purpose and outlines confidentiality.