kapiko.ai
Music Genre Analytics

😒 Sad

😒 Sad Music Genre Analysis: Audio Features, AI Prompts & Production Profile

Melancholic music for emotional reflection, cathartic listening, and rainy day contemplation β€’ 100 tracks analyzed β€’ Spotify audio features dataset
Median BPM
87
Laid-back groove
Avg Energy
33%
Mellow vibe
Avg Valence
29%
Introspective
Top Key
G#/Ab
Major 13 | Minor 3
Major/Minor
55/45
Balanced mood

A data-driven breakdown of 😒 sad music based on Spotify audio features and Gemini AI analysis of the top 100 tracks. Use these insights to understand what makes 😒 sad music sound the way it does β€” and to generate your own.

Jump to Prompt Lab ↓ Download Data (JSON) ↓

Genre Profile

Sad music is characterized by moderate energy, balancing movement with space, with a median energy of 39.3%. The genre carries a blend of acoustic and electronic elements (64.0% acousticness). Instrumentalness sits at 0.0%, while danceability registers at 73.2% β€” making it highly danceable. The emotional tone is emotionally balanced, neither overtly happy nor sad, with valence at 40.3%. Speechiness is minimal at 10.0%.

The typical sad track moves at a moderate tempo that sits comfortably in walking-pace territory of 115.0 BPM (Β±23.2). Tonally, G is the most common key (17 of 100 tracks), and 55% of tracks are in a major key β€” showing a fairly even split between light and dark tonalities.

The genre's sonic identity is shaped by artists like Shiloh Dynasty, Kina, iamjakehill alongside SnΓΈw, itssvd. The typical track runs about 2.3 minutes, optimized for streaming attention spans.

Production-wise, sad sits at a median loudness of -10.5 dB β€” relatively quiet compared to mainstream genres, preserving dynamic range. Whether you're producing in the genre or analyzing it for AI music generation, these numbers provide a precise target for capturing the authentic sad sound.

Prompt Lab

How to Prompt a Hit

Transform lo-fi hip hop's laid-back groove into AI prompts. These data-driven insights help you craft the perfect chill beats β€” from Suno's warm analog vibes to Udio's crisp vinyl textures.

Suno
Excels at boom-bap rhythms and vintage textures. Great for J Dilla-inspired swing patterns.
Udio
Exceptional drum quality and sample clarity. Perfect for clean, professional lo-fi beats.
Stable Audio
Open-source flexibility for experimental lo-fi. Good for extended chill-hop sessions.
MusicGen (Meta)
Strong at jazz-influenced chord progressions and organic sample textures.
Riffusion
Real-time beat generation. Ideal for live lo-fi streaming and spontaneous creativity.
Feature Translator
BPM 101-133: melancholic pace
Energy 32-52%: melancholic,  reflective
Valence 32-52%: melancholic, reflective, cathartic, lonely
Danceability 66-76%: sad groove
Acousticness 41-61%: piano textures
Instrumentalness 1-21%: Focus on piano
Speechiness 16-26%: Clean sad passages
Tempo: melancholic to moderate
Key preference: G, C#/Db, warm keys
Prompt Template
Create a sad track with:

β€’ piano foundation (114 BPM)
β€’  melancholic-inspired  strings
β€’ melancholic piano patterns
β€’  soft guitar  ambient pads
β€’  reflective production style
β€’  tearjerker sound design
β€’ Moderate energy (42%), melancholic mood (42%)
β€’ sad arrangement (11%), piano elements (51%)

Artists to reference: Shiloh Dynasty, Kina, iamjakehill, SnΓΈw, itssvd, SadBoyProlific

Duration: 3-4 minutes, perfect for  lonely listening
Genre Recipe JSON
{
  "genre": "sad",
  "audio_features": {
    "bpm": {"min": 62, "max": 181, "median": 114},
    "energy": {"avg": 0.423, "range": "melancholic"},
    "valence": {"avg": 0.425, "range": "melancholic, reflective, cathartic, lonely"},
    "danceability": {"avg": 0.719, "range": "sad groove"},
    "acousticness": {"avg": 0.517, "range": "piano/organic"},
    "instrumentalness": {"avg": 0.116, "range": "focus on piano"},
    "key_preference": ["G", "C#/Db", "G#/Ab"],
    "mode_preference": {"major": 55.0, "minor": 45.0}
  },
  "production_style": {
    "instruments": ["piano", "strings", "soft guitar", "ambient pads", "vulnerable vocals"],
    "style_tags": ["sad", "melancholic", "tearjerker", "heartbreak", "somber"],
    "mood_descriptors": ["melancholic", "reflective", "cathartic", "lonely"],
    "tempo_category": "melancholic_to_moderate"
  },
  "reference_artists": ["Shiloh Dynasty", "Kina", "iamjakehill", "Sn\u00f8w", "itssvd", "SadBoyProlific", "Powfu", "347aidan"],
  "track_characteristics": {
    "typical_length": "3-4 minutes",
    "listening_context": " lonely listening",
    "production_focus": "piano foundation"
  }
}}

Audio DNA

Key finding: Six audio features define sad's fingerprint: Danceability leads at 73.2%, while Instrumentalness sits at just 0.0% β€” a genre defined as much by what it lacks as what it contains.

Feature Summary
Lo-fi hip hop's signature sound profile: highly acoustic (63%) and instrumental (76%), with moderate danceability (62%) and subtle energy (33%). The genre maintains its introspective character through low valence (29%) while preserving enough rhythmic elements to keep listeners engaged.
Energy
33%
Valence
29%
Danceability
62%
Acousticness
63%
Instrumentalness
76%
Speechiness
5%

Rhythm & Tonality

Key finding: 55% of sad tracks are in a major key, with G the most common. Typical BPM: 115.0 (Οƒ 23.2).

BPM Distribution
Lo-fi hip hop centers around the 60-90 BPM sweet spot (61%), creating that signature laid-back groove. The median of 87 BPM perfectly captures the unhurried, contemplative pace that defines the genre.
Key Distribution
G#/Ab emerges as the dominant key (16%), followed by C#/Db and A#/Bb (11% each). This preference for flat keys contributes to the genre's warm, slightly detuned character.
Major vs Minor Mode
Surprisingly balanced at 55% major, 45% minor. While minor modes provide melancholic undertones, major keys add brightness and hope to lo-fi's introspective landscape.
Duration Histogram
Most tracks fall between 2-4 minutes (65%), with a median of 2.7 minutes. This brevity aligns with lo-fi's purpose as background music and loop-friendly content.

Emotional Fingerprint

Energy vs Valence
Lo-fi hip hop occupies the contemplative low-energy, low-valence quadrant. The target zone annotation shows the genre's sweet spot: mellow but not depressing, introspective but not aggressive.
Acousticness vs Instrumentalness
High concentrations in the upper-right quadrant reveal lo-fi's preference for organic, instrumental sounds. Most tracks blend acoustic elements with instrumental arrangements, perfect for studying or relaxation.

Top Artists

Key finding: Shiloh Dynasty dominates with 12 tracks in the top 100, followed by Kina (6) and iamjakehill (6).

Most Featured Artists
Emancipator and Otaku lead with 6 tracks each, followed by Bonobo (5). These artists represent different facets of lo-fi: Emancipator's downtempo electronica, Otaku's anime-inspired beats, and Bonobo's trip-hop influences.

What Makes a Hit

Popular vs Unpopular Comparison
Top 25 tracks show higher energy (33% vs 20%) and slightly more danceable rhythms. Successful lo-fi maintains the genre's core characteristics while adding just enough movement to keep listeners engaged.
Outlier Spotlight
Tracks that break conventional lo-fi boundaries while maintaining genre appeal.
My Kind of Woman
Mac DeMarco
Highest energy (77%) - brings indie rock energy to lo-fi aesthetics
Glimpse of Us
Joji
Fastest tempo (170 BPM) - modern R&B pace in lo-fi arrangement
Chamber Of Reflection
Mac DeMarco
Highest valence (51%) - uplifting mood breaks melancholic trend
Affection
Jinsang
Highest valence in pure instrumental (61%) - joy through melody alone
Novacane
Frank Ocean
Lowest acousticness (6%) - electronic production in organic-leaning genre
Shoreditch
Clint Is Quinn
Lowest energy (11%) - extreme minimalism pushes ambient boundaries

Feature Correlations

Correlation Heatmap
Strong negative correlation between energy and acousticness (-0.64) reflects lo-fi's preference for organic sounds over electronic energy. Positive energy-valence correlation (0.42) suggests livelier tracks tend toward brightness.

Production Profile

Soul Production Characteristics
The genre's production DNA emphasizes vintage textures, chopped samples, and analog warmth. Vinyl crackle and tape saturation are nearly universal, while J Dilla-style swing defines the rhythmic foundation.

Top Tracks

Key finding: The most popular sad track is “death bed (coffee for your head)” by Powfu with a popularity score of 83.

Soul Essential Tracks
From nostalgic classics to modern interpretations, these tracks define lo-fi hip hop's emotional and sonic landscape. Notice the prevalence of introspective themes and collaborative artists.
# Track Artist Popularity BPM Energy Valence Key

Frequently Asked Questions

What BPM is Sad music?+
Based on analysis of the top 100 sad tracks on Spotify, the median BPM is 115.0 with a standard deviation of 23.2. The typical range falls between 102.0 and 133.2 BPM.
What key is Sad music usually in?+
The most common key in sad music is G, and 55% of tracks are in a major key.
How do I make Sad music with AI?+
Use AI music generators like Suno or Udio with genre-specific prompts. Key parameters for sad: BPM around 115.0, energy level around 39.3%, and valence around 40.3%. Visit the Prompt Lab section on this page for a ready-to-copy prompt template.
What instruments are used in Sad music?+
Sad music typically features a mix of acoustic and electronic elements β€” live instruments blended with digital production. With an acousticness of 64.0% and instrumentalness of 0.0%, the genre sits in a hybrid space (64.0% acousticness, 0.0% instrumentalness), combining organic and electronic sounds.
Is Sad music happy or sad?+
With a median valence of 40.3%, sad music is emotionally balanced, sitting between sad and happy. 55% of tracks use major keys, which provides a sense of balance and emotional complexity.
Electronic Ambient Pop Indie

Sources & Methodology

This analysis is based on Spotify Audio Features API data for the top 100 😒 sad tracks by popularity, supplemented by Gemini AI audio analysis of 30-second preview clips.

Audio features (energy, valence, acousticness, instrumentalness, danceability, speechiness, tempo, key, mode, loudness, duration) are sourced directly from Spotify's audio analysis pipeline. Production insights, mood classifications, and instrumentation details are generated by Gemini AI.

Data was collected and analyzed by kapiko β€” a music analytics platform for AI-era music production.