Cortex Skills

snowpark-connect


Compatible with: Cortex

About this skill

Snowpark Connect

Skills for working with Snowpark Connect for Spark (SCOS) on Snowflake.

When to Use

  • User wants to migrate PySpark or Databricks code to Snowflake
  • User asks about SCOS or Snowpark Connect compatibility
  • User wants to validate a completed SCOS migration
  • User mentions "spark connect", "scos", or "snowpark connect"

Intent Detection

Determine which sub-skill to load based on user intent:

Start
  ↓
Analyze User Request
  ↓
  ├─→ Migration intent → Load migrate-pyspark-to-snowpark-connect/SKILL.md
  │     (convert, migrate, update imports, rewrite for SCOS)
  │
  └─→ Validation intent → Load validate-pyspark-to-snowpark-connect/SKILL.md
        (validate, verify, check migration, test compatibility)
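The routing above can be sketched as a simple keyword classifier. The keyword lists and sub-skill paths come from this page; the function itself is illustrative, not part of the skill:

```python
# Minimal sketch of the intent-detection step, assuming plain substring
# matching is enough. The real skill applies this routing via the agent,
# not via a Python function.

MIGRATION_KEYWORDS = {"migrate", "convert", "rewrite", "update imports", "move to scos"}
VALIDATION_KEYWORDS = {"validate", "verify", "check", "test", "review migration"}

def route(request: str) -> str:
    """Return the sub-skill to load for a user request."""
    text = request.lower()
    if any(kw in text for kw in MIGRATION_KEYWORDS):
        return "migrate-pyspark-to-snowpark-connect/SKILL.md"
    if any(kw in text for kw in VALIDATION_KEYWORDS):
        return "validate-pyspark-to-snowpark-connect/SKILL.md"
    return "ask-user-to-clarify"
```

Ambiguous requests fall through to a clarification step rather than guessing a route.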

Route: Migrate PySpark to Snowpark Connect

If user wants to migrate or convert PySpark code:

  • Keywords: migrate, convert, rewrite, update imports, move to SCOS
  • Load migrate-pyspark-to-snowpark-connect/SKILL.md
  • Follow the migration workflow
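As a rough illustration of what the migration workflow produces (a `_scos` copy of each script with a migration header), here is a minimal sketch. The `snowpark_connect_session` replacement is a placeholder: the actual compatibility rewrites are defined by the sub-skill, not here.

```python
# Hypothetical sketch of one migration step: copy a PySpark script to a
# *_scos.py file, prepend a migration header, and mark the session setup
# for rewrite. The replacement name `snowpark_connect_session` is
# illustrative only.
from pathlib import Path

HEADER = "# Migrated to Snowpark Connect for Spark (SCOS)\n"

def migrate_script(src: Path) -> Path:
    code = src.read_text()
    # Placeholder rewrite: the real SCOS session bootstrap comes from
    # the migrate-pyspark-to-snowpark-connect sub-skill.
    code = code.replace(
        "SparkSession.builder.getOrCreate()",
        "snowpark_connect_session()  # TODO: SCOS session bootstrap",
    )
    dst = src.with_name(src.stem + "_scos" + src.suffix)
    dst.write_text(HEADER + code)
    return dst
```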

Route: Validate a Migration

If user wants to validate or verify a completed migration:

  • Keywords: validate, verify, check, test, review migration
  • Load validate-pyspark-to-snowpark-connect/SKILL.md
  • Follow the validation workflow
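A validation pass of the kind this route describes might look like the following sketch, producing the per-check pass/fail entries mentioned under Output. The two checks are illustrative stand-ins for the sub-skill's real checklist:

```python
# Sketch of a static validation pass over a migrated *_scos file,
# assuming regex checks are sufficient. Each check maps a report label
# to (pattern, should_match).
import re

CHECKS = [
    ("has migration header", r"\A# Migrated", True),
    ("no direct SparkSession builder", r"SparkSession\.builder", False),
]

def validate_source(code: str) -> dict:
    """Return a pass/fail entry per check, like the validation report."""
    report = {}
    for label, pattern, should_match in CHECKS:
        found = re.search(pattern, code) is not None
        report[label] = (found == should_match)
    return report
```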

Stopping Points

None — this skill routes to sub-skills. Stopping points are defined within each sub-skill.

Output

Output is determined by the loaded sub-skill:

  • Migration: Migrated _scos files with compatibility fixes and migration headers
  • Validation: Validation report with pass/fail status for each check

Installation

Install using the Cortex CLI:

```bash
$ cortex skill install snowpark-connect
```

Requires Cortex CLI v2.0+.


SKILL.md Preview

SKILL.md
---
name: snowpark-connect
description: |
  Snowpark Connect (SCOS) skills for migrating and validating PySpark workloads on Snowflake.
  Use when: migrating PySpark to Snowpark Connect, validating SCOS migrations,
  analyzing Spark compatibility, or working with Snowpark Connect for Spark.
  Triggers: snowpark connect, scos, pyspark migration, spark connect,
  validate migration, pyspark compatibility.
---


Tags

#snowpark-connect

Related Skills

  • semantic-view (cortex-community, v1.0.0, #semantic-view)
    [REQUIRED] Use for ALL requests that mention: create, build, debug, fix, troubleshoot, optimize, improve, or analyze a semantic view — AND for requests about VQR suggestions, verified queries, verified query representations, or seeding/generating queries for a semantic view. This is the REQUIRED entry point - even if the request seems simple. DO NOT attempt to create, debug, or generate VQR suggestions for semantic views manually - always invoke this skill first. This skill guides users through creation, setup, auditing, VQR suggestion generation, and SQL generation debugging workflows for semantic views with Cortex Analyst.

  • cortex-code-guide (cortex-community, v1.0.0, #cortex-code-guide)
    Complete reference guide for Cortex Code (CoCo) CLI. Use when: learning cortex features, understanding commands, troubleshooting setup, exploring Snowflake tools, managing sessions, configuring agents, keyboard shortcuts, MCP integration. Triggers: how to use cortex, cortex guide, cortex help, cortex commands, getting started, snowflake tools, #table syntax, subagents, sessions, resume, fork, rewind, compact, /agents, configuration.

  • internal-marketplace-org-listing (cortex-community, v1.0.0, #data-products)

  • data-quality (cortex-community, v1.0.0, #data-quality)
    Schema-level data quality monitoring, table comparison, dataset popularity analysis, and ad-hoc column quality assessment using Snowflake Data Metric Functions (DMFs) and Access History. Use when user asks about: data quality, schema health, DMF results, quality score, trust my data, quality regression, quality trends, SLA alerting, data metric functions, failing metrics, quality issues, compare tables, data diff, validate migration, table comparison, popular tables, most used tables, unused data, dataset usage, table popularity, listing quality, listing health, listing freshness, provider data quality, consumer data quality, one-time quality check, quick quality scan, check data quality without DMFs, recommend monitors, what should I monitor, DQ coverage gaps, unmonitored tables, DMF coverage report, monitoring health, noisy monitors, silent monitors, misconfigured monitors, DMF cost optimization, investigate DQ incident, why did freshness drop, why did row count drop, correlate violation, multi-dimensional root cause, circuit breaker, pause pipeline on violation, halt bad data propagation, custom DMF, format validation DMF, email format check, value range check, referential integrity DMF, DMF expectations, set threshold, tune DMF threshold, DMF expectation management, attach DMFs, set up DMFs for first time, DMF setup wizard, accepted values, ACCEPTED_VALUES, validate column values, allowed values check, value in set, categorical validation.

Details

  • Stars: 446
  • Installs: 5.0k
  • Author: cortex-community
  • Version: v1.0.0
  • Updated: 2 hours ago
  • License: MIT

Quick install:
$ cortex skill install snowpark-connect