
Data Privacy and Security With AI

Students, faculty, and staff must not share non-public institutional information with AI models, such as the chat interfaces offered by popular large language models (e.g., ChatGPT). Sensitive data include, but are not limited to,

  • Protected health information (PHI)
  • Student work and personally identifiable student information (FERPA-protected information)
  • Personnel files and financial information
  • IRB-controlled research data
  • Any information that can be used to trace a person’s identity, such as name, address, phone number, date of birth, or Social Security number (personally identifiable information, or PII)

Members of the Endicott community who would like to use non-public institutional data in AI applications should consult Academic Technology and IT first to identify institutionally licensed or managed solutions.

Faculty Guidelines for AI Use

Experiential learning at Endicott depends on human connections: while AI may augment teaching practices, it should not replace instructor grading and human feedback. Faculty retain discretion over defining the acceptable role of AI in coursework and determining how it may influence grades, participation, and assessments.

At the start of each course, faculty members are encouraged to clearly outline the expectations for AI usage in that specific class, including:

  • Specific assignments or tasks where AI is permitted or prohibited.
  • The potential academic integrity consequences of misuse or unapproved AI usage.
  • If AI is permitted in the class, the following should be made clear:
    • Approved AI tools or platforms.
    • How AI usage should be disclosed (e.g., citations, statements of use).
    • The cost of any required AI tools.

Student Guidelines for AI Use

Students are expected to use AI tools ethically and responsibly, adhering to the following principles:

  • Faculty retain discretion over defining the acceptable role of AI in coursework and determining how it may influence grades, participation, and assessments. As a result, students are expected to follow the guidelines established in each course syllabus and to understand the potential academic integrity consequences of misuse or unapproved AI usage.
  • AI should supplement learning and not replace the student's original thought, analysis, or effort.
  • Students must disclose AI involvement when creating content, solving problems, or conducting research, as outlined by course-specific guidelines.
  • Misrepresentation of AI-generated work as entirely one’s own is considered academic dishonesty and will be subject to disciplinary action under the College’s academic integrity policy.

AI Misuse and Consequences

Misuse of AI tools, including failure to adhere to course-specific AI guidelines or attempts to gain an unfair advantage, may result in academic penalties. Faculty should clearly communicate the consequences of violations, which may include reduced grades, a failing grade on an assignment, and/or referral to the academic integrity committee.

As AI assistants are increasingly embedded in general-purpose applications and workspaces, it may not be obvious to users that they are using AI. For example, Grammarly relies on AI to provide writing assistance, which may be flagged by AI detection software. It is therefore important for faculty to clarify, and for students to check, which AI tools are and are not allowed on a class-by-class basis.

To uphold student privacy, consent, and data security, Endicott offers faculty access only to Turnitin’s AI detection tool; other detectors should not be consulted in suspected cases of academic misconduct. Moreover, Turnitin’s results should be considered alongside the totality of evidence and the broader context of the class, such as the student’s prior work and course-specific AI policies.

These guidelines ensure that AI use enhances academic achievement while fostering a culture of integrity, creativity, and accountability at Endicott.

Support for Learning with AI

The College recognizes that not all students or faculty may be equally familiar with AI tools. Faculty are encouraged to provide reasonable support or guidance on AI literacy where applicable, such as training resources or workshops. Students are encouraged to seek assistance from faculty, the library, or the Division of Academic Success to ensure they understand the ethical and effective use of AI in their academic work.

Modifications and Exceptions

Students with disabilities may be permitted to use AI-assistive technologies for note-taking if such tools are indicated by their documented accommodations. The Center for Accessibility Services will work with faculty to ensure that the use of assistive technologies does not fundamentally alter the course.

Genio Notes is the only AI note-taking application currently approved for use in Endicott classrooms. Students using Genio Notes agree to terms of use that include not disseminating classroom recordings. For more information, please refer to the Center for Accessibility Services website.

AI Meeting Assistants

With the exception of Genio Notes (above, for accommodations) and Zoom’s native recording features, AI meeting assistants that transcribe or otherwise record meetings are prohibited in all learning, research, and business contexts at Endicott. Zoom hosts are empowered to remove any unpermitted meeting assistants (e.g., Otter.ai) from their meetings.