During the Prototype Phase of the Nittany AI Challenge, participants will develop prototypes of the ideas submitted during the Idea Phase.
At the conclusion of the Prototype Phase, participating teams will deliver a 5-minute video demonstration of their prototype to a panel of judges. The video must be accompanied by supporting written documentation providing contextual details related to the prototype and the proposed development plan for the final Minimum Viable Product (MVP) Phase of the Challenge.
Important dates for the Prototype Phase of the Nittany AI Challenge include:
- Wednesday, March 15, 2023, by noon (ET) — Deadline to upload video link and written documentation.
- Friday, March 31, 2023 — Prototype winners will be announced.
The reviewers will include:
- content or domain experts related to the challenges the prototype addresses
- technical experts fluent in AI/ML capabilities
- industry representatives
- key decision makers from Penn State and Penn State World Campus
Panelists will review the video overviews and written documentation and pose questions to the teams in an online discussion group. Teams are responsible for responding to all reviewer questions within a set period before the final reviews are submitted.
By noon (ET) on Wednesday, March 15, all teams submitting a prototype for review are required to submit video demonstrations of their working prototypes. The videos must:
- be no more than 5 minutes in length
- explain the intent, goals, and potential impact of the solution
- demonstrate the basic, working functionality of the prototype
- be available through a YouTube link accessible for viewing by the Challenge reviewers
The production value of the videos will not be factored into the review, but they must clearly and accurately represent the prototype functionality. To help, Media Commons at Penn State provides free One Button Studio options throughout the Commonwealth.
The written documentation should include, but is not limited to, an overview of the solution, a development timeline, team qualifications, data collection and use plans, and user experience and interface development plans. Unlike the first round, the supporting documentation in this round is not limited in length, though teams are cautioned to limit their materials to those that directly address the review criteria described below. Typically, this documentation runs four to six pages, depending on the number of screenshots and graphics used.
You can download a basic template for the documentation. Feel free to deviate from this template to make it your own. It is provided only to help you get started.
The recommended format includes:
Cover Page
A one-page section that lists:
- title of project
- name, campus, and college of each team member
- name, email, and phone number of the primary point of contact
Problem Statement and Project Overview
Documentation should include an overview of the problem and the method used to address that problem.
Use Case
Provide a sample use case for the tool. This is most commonly a description of a typical user of the tool and how the tool, once fully developed, would be used.
Timeline
Provide a timeline of major development milestones. This timeline should conclude with the delivery of an MVP on August 10, though teams are welcome to include development milestones beyond that date.
Technical Approach
Project documentation should include a detailed technical description of the approach the team will use to achieve its proposed goal, including how the selected AI platforms are used within the prototype and how the team anticipates using those and other services in the MVP. Specifically, the documentation should list:
- the components of the selected AI platforms leveraged in the prototype
- any additional components that may be leveraged in developing the MVP
- additional services that may be necessary for continued development
This section should also broadly explain the technological approaches and development pathways toward the final minimum viable product. Proposals should describe the data and/or AI training the solution requires, to give reviewers a better sense of the additional resources needed to implement the idea.
Data Sources
In this section, detail the data sources leveraged within the prototype as well as the data sources necessary for the continued development and successful implementation of the tool. If available, please detail the location and availability of the data sources and/or the plan for collecting the necessary data. Remember that while we can provide some assistance with finding data sources, finding and gaining access to those sources is the team’s responsibility.
Team Capabilities
Prototype documentation should include a section describing the team’s capabilities to implement the proposed solution. At a minimum, teams should include individuals with the technical expertise necessary to develop the tool, content experts with knowledge of the domain being addressed, and individuals with the UX skills to address the solution’s usability.
User Experience and Interface Design Plans
While significant detail about the tool’s eventual UI elements and UX approach is not necessary at this phase, an overview of the approach for addressing these critical issues will be helpful for reviewers. A description of any planned user testing and its associated timeline would also be beneficial.
Criteria for Review
The following criteria will be used in the selection process for the Prototype Phase of the Nittany AI Challenge. It would be advantageous to align the proposal with the items listed below, ensuring that each of the guiding questions is addressed. View the full rubric (PDF).
Impact (10 points)
- Does the prototype have the potential for significant breadth and/or depth of impact?
- Does the prototype demonstrate a potential for long-term impact?
- Does the idea shown in the prototype have potential applications beyond those proposed?
Feasibility (10 points)
- Can the proposal be realistically completed in the time allotted?
- Can the available technology support the execution of the proposed idea?
- Are requisite supports (e.g., data sources) available for the execution of the idea?
Implementation and Scaling (10 points)
- Does the solution’s current development approach allow for its eventual use at scale?
- Do the solution’s functionality and UX approach have the potential to scale to meet the needs of a large user base?
Team Capabilities (10 points)
- Is the team capable of executing the proposed idea?
- Does the team have the technical expertise to successfully complete an MVP based on their idea?
- Does the team have the content/domain expertise to successfully complete an MVP based on their idea?
Technical Sophistication (10 points)
- Does the prototype reflect technical proficiency?
- Does the prototype provide sufficient functionality to demonstrate a proof of concept?
- Does the prototype function as intended and described?
Use of AI and ML Technologies (10 points)
- Does the prototype leverage AI/ML capabilities in a significant and/or meaningful way?
- Does the prototype leverage all appropriate AI/ML capabilities aligned with its intended functionality?
Use of Available Data (5 points)
- Does the idea leverage existing data sources in a meaningful way?
- Is the plan for leveraging existing data sources reasonable and sufficient?
Interface Design Plans/Consumability (5 points)
- Is there a sufficient plan for development of the user interface design?
- Does the written documentation reflect a cogent plan for addressing consumability of the tool?