WE CREATE
LEARNING & ASSESSMENT
TOOLS
We are a development team based in the Cincinnati, Ohio, area.
We developed the Cognero Assessment Suite over a decade ago and work to continuously improve Cognero for our users.
We partner with leading education providers, educational publishers, and institutions to help improve teaching and learning in K-20 education.
Welcome to Cognero.
For more than a decade, we have remained committed to the unmatched support of our partners and to working in every way to thoughtfully elevate their work with the vibrant learning communities they serve, support, and inspire.
We are grateful to be a trusted teaching and learning partner across K-20 learning spaces, from kindergartners to graduate students, and we fully understand the gravity of our role and responsibilities to the institutions, instructors, and millions of learners who depend upon us.
Our work on innovative solutions, security, and interoperability continues.
We are incredibly proud to be recognized as a Certified 1EDTECH (formerly IMSGlobal) TrustED App and look forward to continuing our contributions to the robust community of learning solutions providers.
Please reach out if you have any questions, or request a demo if you would like to learn more.
Many thanks,
Christian Masterson, Arizona Stafford,
and the entire Cognero team!
TEACHING & LEARNING.
WITHOUT LIMITS!
Cognero's Built-in Question Types
Scroll to view each of the question types available in Cognero, along with an example and a short description of each.
- A statement that students must correctly identify as true or false
- Automatically graded in online tests
- Can contain answer-specific feedback
- A statement that students must correctly identify as true or false. If the statement is false, students must also enter a replacement word or phrase that makes the statement true.
For example…
Q: Cincinnati is the capital of Ohio.
A: false, Columbus
- Automatically graded in online tests
- Can contain answer-specific feedback
- A question that students must correctly answer by choosing yes or no
- Automatically graded in online tests
- Can contain answer-specific feedback
- A question for which students must identify the one correct answer from a list of choices provided
- Can contain from 2 to 26 answer choices
- Can lock some or all answer choices from scrambling (e.g., useful if choice D is “none of the above”)
- Common in PARCC and Smarter Balanced assessments
- Can contain answer-specific feedback
- Automatically graded in online tests
- A question for which students must identify one or more correct answers from a list of choices provided
- Can contain from 2 to 26 answer choices
- All correct answer choices must be identified by students to earn credit (no partial credit)
- Can lock some or all answer choices from scrambling
- Common in PARCC and Smarter Balanced assessments
- Can contain answer-specific feedback
- Automatically graded in online tests
- A question for which students must enter the correct numeric answer
- Answers may be set up as regular numbers or fractions.
- Students may be required to choose from a list of units (e.g., feet, inches) as part of the answer.
- Student answer entry can be limited to numeric characters only, a certain number of decimal places, and a certain number of significant figures.
- Automatic scoring is customizable. If scoring is set to “arithmetic equivalency”, students must provide an answer that is mathematically equivalent to the correct answer; for example, if the answer is “.5”, students could answer “0.5”, “.5”, “1/2”, “2/4”, etc. If scoring is set to “exact”, students must enter the answer exactly as it is set in the content. (See the sketch after this list.)
- Common in PARCC and Smarter Balanced assessments
- Can set to be automatically graded in online tests or to be graded manually
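As a rough illustration of the difference between the two scoring modes, here is a minimal sketch in Python. It is an assumption-laden example, not Cognero's actual grading code: the grade_numeric helper is hypothetical, and the standard fractions module stands in for whatever equivalency check the product really uses.

```python
from fractions import Fraction

def grade_numeric(student_answer: str, keyed_answer: str,
                  mode: str = "arithmetic equivalency") -> bool:
    """Return True if the student's numeric response earns credit (illustrative only)."""
    if mode == "exact":
        # The response must match the keyed answer character for character.
        return student_answer.strip() == keyed_answer.strip()
    try:
        # Fraction(".5"), Fraction("0.5"), Fraction("1/2"), and Fraction("2/4")
        # all reduce to the same value, so equivalent forms earn credit.
        return Fraction(student_answer.strip()) == Fraction(keyed_answer.strip())
    except (ValueError, ZeroDivisionError):
        return False  # not a parsable number or fraction

# Any mathematically equivalent form is accepted under "arithmetic equivalency"...
assert all(grade_numeric(ans, ".5") for ans in [".5", "0.5", "1/2", "2/4"])
# ...but only the keyed form is accepted under "exact" scoring.
assert grade_numeric(".5", ".5", mode="exact") and not grade_numeric("0.5", ".5", mode="exact")
```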
- A statement for which students must enter the correct text answer to fill in a blank
- Since completion questions are graded automatically in online tests, the answer is typically one or two words.
- When creating a question, more than one correct answer may be entered in the answer field. For example, the answer may be set to “World War Two”, “World War II”, or “World War 2”, and any of these would be accepted as correct (see the sketch after this list).
- Common in PARCC and Smarter Balanced assessments
- Automatically graded in online tests
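To make the accepted-answers idea concrete, here is a minimal sketch that treats comparison as case- and whitespace-insensitive. The grade_completion name and the normalization rule are assumptions for illustration, not Cognero's documented matching behavior.

```python
def grade_completion(student_answer: str, accepted_answers: list[str]) -> bool:
    """Credit the response if it matches any keyed answer (illustrative normalization)."""
    normalize = lambda s: " ".join(s.split()).casefold()
    return normalize(student_answer) in {normalize(a) for a in accepted_answers}

accepted = ["World War Two", "World War II", "World War 2"]
assert grade_completion("world war II", accepted)   # any keyed variant earns credit
assert not grade_completion("WWII", accepted)       # not in the keyed list
```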
- A question or statement containing one or more blanks into which students must enter the correct text answer(s)
- Since multi-blank questions are graded automatically in online tests, the answers are typically one or two words
- Answer blanks within a question may be set up to be scored in any order. For example, the answer for a question about the six types of simple machines may be set to allow for “lever”, “pulley”, “inclined plane”, “wheel and axle”, “wedge”, and “screw” in any order
- Alternatively, answer blanks within a question may be set up to be scored in a specific order. For example, the answer for a question about the steps of the writing process may be set to require “prewriting”, “writing”, “revising”, “editing” and “publishing” in that specific order
- Questions may be set up to allow for partial credit (see the sketch after this list).
- Common in PARCC and Smarter Balanced assessments
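The any-order versus fixed-order behavior, with or without partial credit, might look something like the sketch below. The function and parameter names are made up for illustration, and the real scoring rules may differ.

```python
from collections import Counter

def grade_blanks(responses: list[str], keyed: list[str],
                 any_order: bool = False, partial_credit: bool = False) -> float:
    """Return the fraction of credit earned on a multi-blank question (sketch only)."""
    norm = lambda s: " ".join(s.split()).casefold()
    answers, key = [norm(r) for r in responses], [norm(k) for k in keyed]
    if any_order:
        # e.g., the six simple machines may be entered in any order
        correct = sum((Counter(answers) & Counter(key)).values())
    else:
        # e.g., the writing-process steps must appear in exactly this sequence
        correct = sum(a == k for a, k in zip(answers, key))
    return correct / len(key) if partial_credit else float(correct == len(key))

machines = ["lever", "pulley", "inclined plane", "wheel and axle", "wedge", "screw"]
assert grade_blanks(["screw", "wedge", "lever", "pulley", "inclined plane", "wheel and axle"],
                    machines, any_order=True) == 1.0
steps = ["prewriting", "writing", "revising", "editing", "publishing"]
assert grade_blanks(["prewriting", "revising", "writing", "editing", "publishing"],
                    steps, partial_credit=True) == 0.6
```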
- A set of questions for which students must choose the corresponding answer choices from a list
- A matching group can contain from 2 to 26 questions and answer choices.
- Automatically graded in online tests
- A question for which students must enter the correct text answer
- Since objective short answer questions are designed to be graded automatically on electronic tests, the answer is typically just a few words or a short phrase.
- When creating a question, more than one correct answer may be entered in the answer field. For example, the answer could be set to “World War Two”, “World War II”, or “World War 2”, and any would be accepted as correct.
- Automatically graded in online tests
- A question for which students must enter the correct text answer
- Answers are typically short (a phrase or sentence).
- Manually graded by instructor in online tests
- A question that can be displayed on a test as a multiple choice question, an objective short answer question, or a subjective short answer question
- Can easily be toggled between the three modes, allowing instructors to increase or decrease the difficulty level of a test with a single click of the mouse
- Automatically or manually graded depending on mode
- Can contain answer-specific feedback
- A question for which students must identify the correct order for a series of items
- Can contain from 2 to 26 items
- Automatically graded in online tests
- A statement with which students rate their opinion or agreement
- Can contain from 2 to 11 choices
- Has built-in options (Strongly disagree, disagree, neutral, etc.) and can also be customized
- Does not have a correct answer
- Typically used for online surveys, not tests
- A question for which students must enter a text answer
- The answer to an essay question is typically one or more paragraphs that must be manually graded by the instructor.
- Can be set up with a scoring rubric that enables instructors to quickly and consistently grade student essays
- Common in PARCC and Smarter Balanced assessments
- A bin sort question is composed of draggable items, sorting bins, and optional background images and text.
- Draggable items can match one or more bins or none at all.
- Unlike Cognero’s drag-and-drop targets, which can each match only one draggable item, sorting bins can have multiple draggable items that match (see the sketch after this list).
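One way to picture a bin sort answer key is as a mapping from each draggable item to the set of bins in which it counts as correct, with an empty set marking an item that belongs in no bin. This is purely a hypothetical data model and all-or-nothing scoring sketch, not Cognero's schema.

```python
# Hypothetical answer key: each draggable item lists the bins where it is correct;
# an empty set marks a distractor that should be left unsorted.
answer_key: dict[str, set[str]] = {
    "square": {"polygons", "quadrilaterals"},   # correct in more than one bin
    "triangle": {"polygons"},
    "circle": set(),                            # matches no bin
}

def grade_bin_sort(placements: dict[str, str | None], key: dict[str, set[str]]) -> bool:
    """Full credit only if every item sits in a matching bin, or stays unplaced
    when it matches none (illustrative all-or-nothing scoring)."""
    def item_ok(item: str, bins: set[str]) -> bool:
        placed_in = placements.get(item)
        return placed_in in bins if bins else placed_in is None
    return all(item_ok(item, bins) for item, bins in key.items())

assert grade_bin_sort(
    {"square": "quadrilaterals", "triangle": "polygons", "circle": None}, answer_key)
```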
- A question for which students must drag one or more items (text and/or images) to appropriate drop targets. For example, students may be asked to label a diagram of a cell by dragging text labels to their locations in the cell diagram.
- Automatically graded in online-delivered quizzes, assignments, pre-tests, and exams
- Easy to author: Drag and Drop questions are created within Cognero’s intuitive WYSIWYG interface
- Compatible with mobile devices
- Common in PARCC and Smarter Balanced assessments
- Question content is based on HTML5 code, allowing for interactivity (drag and drop, graphing, etc.)
- Questions are currently coded outside of Cognero and imported into the system
- Compatible with mobile devices
- Common in PARCC and Smarter Balanced assessments
- Automatically graded in online tests