Starter Guide: How to get the most out of your AI copilot

A hands-on guide to using AI to improve code quality, automate repetitive tasks, and optimize queries - with code and prompt examples.

June 19, 2024
2 min read

By integrating an AI copilot like DinoAI into your daily workflow, you can significantly boost productivity, enhance code quality, and streamline collaboration. This hands-on guide offers practical examples of how to improve code quality, automate repetitive tasks, and optimize queries.

By leveraging an AI copilot, you can focus on more strategic (and fun!) tasks, ultimately making your work more efficient and enjoyable.

1. Automate repetitive tasks

Automating repetitive tasks is essential for improving productivity and reducing errors. Using SQL and an AI copilot, you can streamline data cleaning processes, ensuring data quality and consistency.

Example: Automating data cleaning with SQL

Step 1: Loading data

-- Load data from a CSV file into a SQL table
COPY my_table FROM '/path/to/data.csv' DELIMITER ',' CSV HEADER;

Step 2: Data cleaning with AI suggestions

-- Copilot Suggestion for Handling Missing Values
UPDATE my_table SET column_name = (SELECT AVG(column_name) FROM my_table) WHERE column_name IS NULL;

-- Additional Copilot Suggestions for Data Cleaning
-- Remove duplicate rows, keeping one copy of each id
-- (the naive "id IN (... HAVING COUNT(*) > 1)" pattern would delete
--  every copy; comparing ctid, Postgres's physical row id, keeps one)
DELETE FROM my_table a
USING my_table b
WHERE a.id = b.id AND a.ctid > b.ctid;
-- Drop rows with non-positive values
DELETE FROM my_table WHERE column_name <= 0;

AI prompt example for extra support

I have a SQL table that I need to clean. Can you provide SQL queries to:
1. Load data from a CSV file into the table.
2. Fill missing values in a specific column with the column's average.
3. Remove duplicate rows.
4. Delete rows where the values in a specific column are less than or equal to zero.
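As a quick sanity check that the deduplication step keeps one copy of each duplicated row (rather than deleting every copy), here is a sketch using Python's built-in sqlite3 module; the table and column names mirror the examples above and the data is made up. SQLite's rowid plays the role of Postgres's ctid:

```python
import sqlite3

# In-memory table with a duplicated row (illustrative data)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (id INTEGER, column_name REAL)")
conn.executemany("INSERT INTO my_table VALUES (?, ?)",
                 [(1, 10.0), (1, 10.0), (2, 5.0), (3, -1.0)])

# Keep one copy per id: delete every row whose rowid is not the
# smallest rowid for that id
conn.execute("""
    DELETE FROM my_table
    WHERE rowid NOT IN (SELECT MIN(rowid) FROM my_table GROUP BY id)
""")

# Drop rows with non-positive values
conn.execute("DELETE FROM my_table WHERE column_name <= 0")

rows = conn.execute(
    "SELECT id, column_name FROM my_table ORDER BY id").fetchall()
print(rows)  # [(1, 10.0), (2, 5.0)]
```

One duplicate of id 1 and the non-positive id 3 are gone, and exactly one copy of each remaining id survives.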

2. Improve code quality

Improving code quality involves optimizing SQL queries and dbt™ models to ensure they are efficient and maintainable. AI copilots can suggest enhancements that make your code more robust and performant.

Example: Code review and optimization

Step 1: Initial dbt™ model

-- models/calculate_mean.sql
WITH data AS (
    SELECT value
    FROM raw_table
)
SELECT
    SUM(value) / COUNT(value) AS mean_value
FROM data

Step 2: AI copilot optimization suggestion

-- Copilot Suggestion for Optimization
WITH data AS (
    SELECT value
    FROM raw_table
)
SELECT
    AVG(value) AS mean_value
FROM data

AI prompt example for extra support

I have a dbt model that calculates the mean of a column from a table:
WITH data AS (
    SELECT value
    FROM raw_table
)
SELECT
    SUM(value) / COUNT(value) AS mean_value
FROM data;

Can you optimize this dbt model to use more efficient SQL functions?
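Beyond readability, AVG also sidesteps a subtle pitfall: in some dialects, SUM(value) / COUNT(value) over an integer column performs integer division and silently truncates. A minimal sketch with Python's sqlite3 module (SQLite exhibits the truncation; behavior varies by database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_table (value INTEGER)")
conn.executemany("INSERT INTO raw_table VALUES (?)", [(1,), (2,), (4,)])

# SUM/COUNT over an integer column: integer division in SQLite
naive = conn.execute(
    "SELECT SUM(value) / COUNT(value) FROM raw_table").fetchone()[0]

# AVG always returns a floating-point mean
mean = conn.execute("SELECT AVG(value) FROM raw_table").fetchone()[0]

print(naive, mean)  # naive is truncated to 2; mean is ~2.33
```

The true mean of 1, 2, 4 is 7/3 ≈ 2.33, but the SUM/COUNT form truncates it to 2 here.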

3. Facilitate collaboration

Encouraging team collaboration involves documenting your dbt™ models and SQL scripts to make them understandable and maintainable by all team members. AI copilots can assist in generating comprehensive documentation.

Example: Using AI to document dbt™ models automatically

Step 1: Initial dbt™ model without documentation

-- models/process_data.sql
WITH processed_data AS (
    SELECT
        id,
        value * 2 AS doubled_value
    FROM raw_table
)
SELECT * FROM processed_data

Step 2: AI copilot generated documentation

# models/schema.yml
version: 2

models:
  - name: process_data
    description: "Processes the input data by doubling each value."
    columns:
      - name: id
        description: "The unique identifier for each record."
      - name: doubled_value
        description: "The value doubled from the original input."

AI prompt example for extra support

I have a dbt model that processes data by doubling each value:
WITH processed_data AS (
    SELECT
        id,
        value * 2 AS doubled_value
    FROM raw_table
)
SELECT * FROM processed_data;

Can you generate detailed documentation for this model, including descriptions for the model and its columns in a schema.yml file?

4. Enhance efficiency metrics

Tracking productivity metrics helps in measuring the effectiveness of your workflow and identifying areas for improvement. AI copilots can assist in monitoring these metrics accurately.

Example: Tracking productivity metrics with SQL integration

Step 1: Define metrics in your SQL workflow

-- Metric: Time to complete a query execution
-- (generated-column syntax shown for PostgreSQL 12+; other dialects differ)
CREATE TABLE query_log (
    query_id INT,
    start_time TIMESTAMP,
    end_time TIMESTAMP,
    execution_time INTERVAL GENERATED ALWAYS AS (end_time - start_time) STORED
);

-- Insert sample data
INSERT INTO query_log (query_id, start_time, end_time) VALUES
(1, '2023-06-14 10:00:00', '2023-06-14 10:01:00'),
(2, '2023-06-14 10:05:00', '2023-06-14 10:06:30');

Step 2: AI Copilot enhanced monitoring

-- AI Copilot Suggestion for Enhanced Monitoring (PL/pgSQL)
CREATE PROCEDURE monitor_query_execution(p_query_id INT, p_query TEXT)
LANGUAGE plpgsql AS $$
DECLARE
    v_start TIMESTAMP;
    v_end   TIMESTAMP;
BEGIN
    v_start := clock_timestamp();
    EXECUTE p_query;
    v_end := clock_timestamp();

    -- execution_time is a generated column, so only the raw times are logged
    INSERT INTO query_log (query_id, start_time, end_time)
    VALUES (p_query_id, v_start, v_end);
END;
$$;

-- Example usage
CALL monitor_query_execution(3, 'SELECT * FROM my_table');

AI prompt example for extra support

I need to measure the time taken to complete SQL query executions. Can you provide SQL scripts to:
1. Create a table to log the start and end times of query executions and calculate the execution time.
2. Create a procedure to monitor query execution, log the times, and calculate the execution time for any given query.
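The same logging pattern can be prototyped outside the database before committing to a stored procedure. A minimal sketch in Python with the built-in sqlite3 module, mirroring the query_log table and monitor_query_execution procedure above (timestamps stored as ISO-8601 text is an assumption of the sketch):

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (id INTEGER)")
conn.execute("""
    CREATE TABLE query_log (
        query_id   INTEGER,
        start_time TEXT,
        end_time   TEXT
    )
""")

def monitor_query_execution(query_id, query):
    """Run a query and record its wall-clock start/end times in query_log."""
    start = datetime.now().isoformat()
    conn.execute(query)
    end = datetime.now().isoformat()
    conn.execute("INSERT INTO query_log VALUES (?, ?, ?)",
                 (query_id, start, end))

monitor_query_execution(3, "SELECT * FROM my_table")

logged = conn.execute("SELECT query_id FROM query_log").fetchall()
print(logged)  # [(3,)]
```

Execution time is then just the difference between the two logged timestamps, exactly as the generated column computes it in the SQL version.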

5. Continuous learning and adaptation

Continuous learning and adaptation are critical for staying current with the latest techniques and tools. AI copilots can assist in prototyping and evaluating simple predictive models using dbt™.

Example: Training and evaluating models

Step 1: Define and train a model

-- models/train_model.sql
WITH training_data AS (
    SELECT *
    FROM raw_table
    WHERE split = 'train'
),
model AS (
    SELECT
        id,
        value,
        value * 2 AS predicted_value
    FROM training_data
)
SELECT * FROM model

Step 2: AI copilot enhanced model training and evaluation

-- models/evaluate_model.sql
WITH test_data AS (
    SELECT *
    FROM raw_table
    WHERE split = 'test'
),
predictions AS (
    SELECT
        id,
        value,
        value * 2 AS predicted_value
    FROM test_data
),
evaluation AS (
    SELECT
        AVG(ABS(value - predicted_value)) AS mean_absolute_error
    FROM predictions
)
SELECT * FROM evaluation

AI prompt example for extra support

I need to train and evaluate a predictive model using dbt. Can you provide dbt models that include:
1. Training the model on training data by doubling the value column.
2. Evaluating the model's performance on test data using mean absolute error (MAE) as the metric.
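The MAE that evaluate_model computes can be checked by hand. A small sketch in plain Python with made-up test rows, applying the same doubling "model":

```python
# Toy test split: (id, value) pairs — illustrative data only
test_data = [(1, 10.0), (2, 4.0), (3, 6.0)]

# Mirror the predictions CTE: the model predicts value * 2
predictions = [(i, v, v * 2) for i, v in test_data]

# Mirror the evaluation CTE: mean absolute error between actual
# and predicted values
mae = sum(abs(v - p) for _, v, p in predictions) / len(predictions)
print(mae)  # (10 + 4 + 6) / 3 ≈ 6.67
```

Since the prediction is always double the actual, the absolute error per row equals the value itself, so MAE here is just the mean of the values - a useful reminder that this example is a scaffold for the workflow, not a real model.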

How are you using your copilot? Tell us! We’d love to know and learn from all of our users.

Interested in learning more?
Try out the free 14-day trial.