
I'm looking for a way to draw sound waves from music.

I want waves like this image

[image: stylised sound waves]

Here are some discussions about displaying waves from music:

  1. WaveForm on IOS
  2. rendering a waveform on an iphone
  3. audio waveform visualisation with iPhone

Github Example Links

But I'm not getting any idea about this type of waveform. Is it possible to draw waves like this image?

Vishal Khatri
    Your image doesn't appear to have any relationship to an actual waveform. Where are you seeing that? –  Oct 31 '13 at 06:58
  • I want to display a waveform like this. Please check this image: http://markhadleyuk.com/wp-content/uploads/2012/01/waveform-essentials-600.jpg – Vishal Khatri Nov 11 '13 at 09:50
  • There are no resources on how to generate a waveform like the ones in your images because they are fake. An audio waveform from a song doesn't look like that. The image in your OP looks like sine waves with a window function. The link in your comment _might_ be real audio data with a low-pass filter but if you are here asking how to do this that is way beyond you. Sorry. There is a plethora of information in the links you've posted and on the web. I don't understand what you want for an answer. – Radiodef Nov 12 '13 at 19:56
  • You can refer to this: http://stackoverflow.com/questions/5032775/drawing-waveform-with-avassetreader and make changes to the image-generation code – S S Nov 13 '13 at 06:37

3 Answers


Disclaimer: a lot of this was discovered through trial and error, so I may have some serious false assumptions in play here.

You would need to use the Audio Units framework. When initialising playback you can create an AURenderCallbackStruct, in which you specify a playback callback function; that callback receives a few arguments containing the information you need.
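
A rough sketch of that wiring in Swift (illustrative only: `audioUnit` is assumed to be an already-created RemoteIO AudioUnit, and the callback is just a stub with the AURenderCallback signature shown below):

import AudioToolbox

// Stub callback: read the samples in `ioData` here and hand them to your drawing code.
let recordingCallback: AURenderCallback = { _, _, _, _, inNumberFrames, ioData in
    return noErr
}

var callbackStruct = AURenderCallbackStruct(
    inputProc: recordingCallback,
    inputProcRefCon: nil                      // optionally a pointer back to your own object
)
let status = AudioUnitSetProperty(
    audioUnit,                                // your RemoteIO unit (assumed to exist already)
    kAudioUnitProperty_SetRenderCallback,
    kAudioUnitScope_Input,
    0,                                        // output element / bus 0
    &callbackStruct,
    UInt32(MemoryLayout<AURenderCallbackStruct>.size)
)
assert(status == noErr, "failed to install render callback")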

The callback function will have a signature like this:

static OSStatus recordingCallback (void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData) 

Inside the callback you get an array of raw audio sample data. You can use it to compute the amplitude of the buffer, or (after an FFT) the amplitude or dB value of each frequency bin.
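
As a rough illustration of that per-buffer level calculation, here is a sketch in Swift (assuming the unit is configured for 32-bit float, non-interleaved samples; the helper name `levelForBuffer` is mine, not part of the framework):

import AudioToolbox
import Foundation

// Hypothetical helper: compute the RMS amplitude and a dB value for one render buffer.
func levelForBuffer(_ ioData: UnsafeMutablePointer<AudioBufferList>,
                    frameCount: UInt32) -> (rms: Float, db: Float) {
    let buffers = UnsafeMutableAudioBufferListPointer(ioData)
    guard frameCount > 0, let firstChannel = buffers.first?.mData else { return (0, -160) }
    let samples = firstChannel.assumingMemoryBound(to: Float32.self)

    var sumOfSquares: Float = 0
    for frame in 0 ..< Int(frameCount) {
        sumOfSquares += samples[frame] * samples[frame]
    }
    let rms = (sumOfSquares / Float(frameCount)).squareRoot()
    let db = 20 * log10(Swift.max(rms, .leastNonzeroMagnitude)) // avoid log10(0)
    return (rms, db)
}

You could call something like this from the callback and push the values to your drawing code on the main thread.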

I don't know what that graph is showing, but it looks to me like a smoothed display of the amplitudes of each of the sample bins.

Audio Units are not simple, but it's worth playing with them for a while until you get a grip on how they work.

Here is a skeleton of my callback function so you have more of a grasp as to what I mean:

EDIT: removed dead link; I've lost this code, sorry.

JConway
  • If you're playing back media in realtime this is the correct answer; you can get the audio data being output through this callback. – keji Jun 13 '14 at 05:14

I, too, have been trying sincerely for the last three months, but I didn't find a solution. For the time being I used static images based on the type of song (static data songs). I added the images to a UIScrollView and changed the contentOffset based on the current position of the audio.
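
A small sketch of that scroll-synchronisation idea in Swift (illustrative only; the class and names are mine, not from the answer, and the scroll view is assumed to already contain the static waveform image):

import AVFoundation
import UIKit

final class WaveformScroller: NSObject {
    private let scrollView: UIScrollView   // contains the pre-rendered waveform image
    private let player: AVAudioPlayer
    private var displayLink: CADisplayLink?

    init(scrollView: UIScrollView, player: AVAudioPlayer) {
        self.scrollView = scrollView
        self.player = player
        super.init()
    }

    func start() {
        displayLink = CADisplayLink(target: self, selector: #selector(step))
        displayLink?.add(to: .main, forMode: .common)
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func step() {
        guard player.duration > 0 else { return }
        // Map playback progress (0...1) onto the scrollable width.
        let progress = CGFloat(player.currentTime / player.duration)
        let maxOffset = max(scrollView.contentSize.width - scrollView.bounds.width, 0)
        scrollView.contentOffset.x = progress * maxOffset
    }
}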

Tendulkar

A little bit of refactoring of the above answers:


import AVFoundation
import CoreGraphics
import Foundation
import UIKit

class WaveGenerator {
    private func readBuffer(_ audioUrl: URL) -> [Float] {
        // Open the file; bail out with an empty array instead of crashing.
        guard let file = try? AVAudioFile(forReading: audioUrl) else { return [] }

        let audioFormat = file.processingFormat
        let audioFrameCount = UInt32(file.length)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)
        else { return [] }
        do {
            try file.read(into: buffer)
        } catch {
            print(error)
            return []
        }

        // Copy the first channel into a Swift array so the samples outlive the PCM buffer.
        guard let channelData = buffer.floatChannelData else { return [] }
        return Array(UnsafeBufferPointer(start: channelData[0], count: Int(buffer.frameLength)))
    }

    private func generateWaveImage(
        _ samples: [Float],
        _ imageSize: CGSize,
        _ strokeColor: UIColor,
        _ backgroundColor: UIColor
    ) -> UIImage? {
        let drawingRect = CGRect(origin: .zero, size: imageSize)

        UIGraphicsBeginImageContextWithOptions(imageSize, false, 0)

        let middleY = imageSize.height / 2

        guard let context: CGContext = UIGraphicsGetCurrentContext() else { return nil }

        context.setFillColor(backgroundColor.cgColor)
        context.setAlpha(1.0)
        context.fill(drawingRect)
        context.setLineWidth(0.25)

        // Normalise so the loudest sample spans half the image height.
        let maxSample = CGFloat(samples.max() ?? 0)
        guard maxSample > 0 else { UIGraphicsEndImageContext(); return nil }
        let heightNormalizationFactor = imageSize.height / maxSample / 2
        let widthNormalizationFactor = imageSize.width / CGFloat(samples.count)
        // Draw one vertical line per sample, mirrored around the vertical centre.
        for index in 0 ..< samples.count {
            let pixel = CGFloat(samples[index]) * heightNormalizationFactor

            let x = CGFloat(index) * widthNormalizationFactor

            context.move(to: CGPoint(x: x, y: middleY - pixel))
            context.addLine(to: CGPoint(x: x, y: middleY + pixel))

            context.setStrokeColor(strokeColor.cgColor)
            context.strokePath()
        }
        guard let soundWaveImage = UIGraphicsGetImageFromCurrentImageContext() else { return nil }

        UIGraphicsEndImageContext()
        return soundWaveImage
    }

    func generateWaveImage(from audioUrl: URL, in imageSize: CGSize) -> UIImage? {
        let samples = readBuffer(audioUrl)
        let img = generateWaveImage(samples, imageSize, UIColor.blue, UIColor.white)
        return img
    }
}

Usage

let waveGenerator = WaveGenerator()
let url = Bundle.main.url(forResource: "TEST1", withExtension: "mp3")!
let img = waveGenerator.generateWaveImage(from: url, in: CGSize(width: 600, height: 200))
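
One design note on the code above: readBuffer returns one value per audio sample, so for a full-length song the drawing loop strokes millions of lines. A possible refinement (my own sketch, not part of the original answer) is to collapse the samples to roughly one peak value per output pixel before drawing:

// Hypothetical helper: bucket the samples, keeping each bucket's peak so
// transients stay visible, then draw the reduced array instead.
func downsample(_ samples: [Float], to pointCount: Int) -> [Float] {
    guard pointCount > 0, samples.count > pointCount else { return samples }
    let bucketSize = samples.count / pointCount
    return (0 ..< pointCount).map { bucket -> Float in
        let start = bucket * bucketSize
        let end = Swift.min(start + bucketSize, samples.count)
        return samples[start ..< end].map(abs).max() ?? 0
    }
}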
Learner