MVP Phase

Phase Overview 

During the Minimum Viable Product (MVP) Phase of the Nittany AI Challenge (NAIC), participants will develop and deliver MVPs based on the prototypes submitted in the previous phase.

At the conclusion of the MVP Phase, participating teams will deliver a seven-minute demonstration of their MVP and pitch for additional funding from the remaining pool of $25,000. Pitches must be accompanied by supporting written documentation providing contextual details about the prototype and the proposed plan for continued development and/or implementation. Potential paths forward include discrete solutions to be shared freely with other organizations, entrepreneurial paths involving potential licensing or business development, or continued development through academic paths such as grants or formal research. 

Important dates for the MVP Phase of the Nittany AI Challenge include: 

  • April 15–16, 2020 — All teams will take part in facilitated workshops employing the Design Thinking methodology to help guide user-focused development. 
  • August 10, 2020 by 5:00 p.m. (ET) — All teams will provide access to their MVPs for internal review by the Nittany AI Alliance staff. These formative reviews will be followed by individual meetings with teams to provide last-minute guidance and feedback.   
  • September 10, 2020 between 1:00 p.m. and 5:00 p.m. (ET) — Live pitch of the MVPs to reviewers at the HUB-Robeson Center on the Penn State University Park campus, followed by an evening networking and celebration event. Remote presentation options will not be available for this event. 

The pitch, lasting no longer than seven minutes, should focus on a demonstration of the MVP and a request for continued funding, delivered to a group of approximately 25 reviewers. These reviewers will include:

  • subject-matter experts related to the challenges the solution addresses 
  • technical experts fluent in AI/ML capabilities 
  • industry representatives 
  • key decision makers from Penn State 

Teams should have representatives present who are capable of answering questions from each of the reviewers. 

MVP Documentation Requirements

Required written documentation should include, but is not limited to, an overview of the problem and solution, a development timeline, team qualifications, data collection/use plans, a business plan (if relevant), letters of support, and user-experience and interface-development plans. The supporting documentation in this round is not limited in length, though teams are cautioned to include only materials that directly address the review criteria described below. Typically, this documentation consists of a single-page executive summary followed by three to eight pages, depending on the number of screenshots/graphics used. 

Cover Page 

A one-page section that includes: 

  • title of project 
  • name, campus, and college of each team member 
  • name, email, and phone number of the primary point of contact 

Executive Summary 

A one-page section that quickly summarizes (often in bullet points): 

  • title of project 
  • problem statement 
  • very brief description of the solution 
  • brief description of future development and implementation plans 
  • overview of the financial ask 

Overview of Project Idea 

Documentation can include an overview of the problem and how your solution addresses it. If appropriate, this section can be a copy of materials and descriptions contained within the Prototype Phase documentation. Most reviewers in this phase will not have participated as reviewers in the Prototype Phase, so this may be the first time they are seeing the MVP. 

Development Timeline 

Provide a timeline of major future development and implementation milestones.  

Technology 

Documentation can include a detailed technical description of the approach the team used to achieve its goal, including how the selected AI platforms are used. Proposals should describe the data and/or AI training the solution requires, to give reviewers a better sense of the additional resources needed to implement the idea. 

Data Sources 

In this section, detail the data sources leveraged within the MVP as well as the data sources necessary for the continued development and successful implementation of the tool. If available, please detail the location and availability of the data sources and/or the plan for collecting the necessary additional data.  

Team Capabilities and Biographies 

MVP documentation should include a section describing the detailed capabilities of the team to implement the proposed solution. At a minimum, teams should include individuals with technical expertise necessary for development of the tool and content experts with knowledge of the domain being addressed. Depending on the proposed path forward for the solution, additional expertise may also be necessary (e.g. marketing, business, graphic design, UI/UX). 

Budget/Breakdown of Ask 

Provide a basic breakdown of your requested level of funding. 

Criteria for Review 

The following criteria will be used in the selection process for the MVP Phase of the Nittany AI Challenge. It would be advantageous to align the MVP proposal with the items listed below, ensuring that each of the guiding questions is addressed.  

Impact (10 points) 

  • Does the MVP have the potential for significant breadth and/or depth of impact? 
  • Does the MVP demonstrate a potential for long-term impact? 
  • Does the MVP have potential applications beyond those proposed? 

Feasibility (10 points) 

  • Given extant constraints (FERPA, technology, data access, Penn State processes, etc.), can this MVP realistically be implemented? 
  • Are requisite supports (e.g. data sources) available for further development/execution of the idea? 

Implementation and Scaling (10 points)

  • Does the solution’s technical approach allow for its eventual use at scale? 
  • Do the solution’s functionality and UX approach have the potential to scale to meet the needs of a large user base? 

Technical Sophistication (10 points) 

  • Does the MVP reflect technical proficiency? 
  • Does the MVP offer sufficient functionality to provide value to the end user? 
  • Does the MVP function as intended and described? 

Use of AI and ML Technologies (10 points) 

  • Does the MVP leverage AI/ML capabilities in a significant and/or meaningful way? 
  • Does the MVP leverage all appropriate AI/ML capabilities aligned with its intended functionality?  

Interface Design (10 points) 

  • Does the MVP reflect a well-designed user interface? 
  • Does the written documentation reflect a cogent plan for addressing consumability of the tool?