We have all played audio in our apps at some point, right?
But have you ever created a complete music player that keeps running even when the application is not in the foreground?
Watch Video Tutorial
So in this article we will see how we can easily create such a music player.
Our requirements are:
- The audio should keep playing even when the application is not in the foreground.
- We should be able to control the music from the lock screen or with a headset.
A normal audio player stops as soon as the application leaves the foreground, right?
So how do we keep it running until the application is killed?
We have a solution: Flutter has some handy libraries to achieve exactly this.
So let's start…
I am assuming you have a simple Flutter project ready.
If yes, then go ahead and open the ‘pubspec.yaml’ file.
Add Dependencies
Go to the dependencies section and add the dependencies below:

```yaml
http: ^0.12.1
audio_service: ^0.11.0
just_audio: ^0.2.1
```

Run `flutter packages get` in the terminal to install the packages if your IDE does not do it automatically.
Here is the link to the plugin.
Make sure you follow the readme and make the appropriate changes in the AndroidManifest.xml and the Info.plist file.
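For reference, the Android changes required by audio_service 0.x look roughly like this; treat this as a sketch and defer to the readme of the exact version you install:

```xml
<!-- AndroidManifest.xml: permissions plus the plugin's service and
     receiver, per the audio_service 0.x readme. -->
<uses-permission android:name="android.permission.WAKE_LOCK"/>
<uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>

<application>
  <service android:name="com.ryanheise.audioservice.AudioService">
    <intent-filter>
      <action android:name="android.media.browse.MediaBrowserService" />
    </intent-filter>
  </service>
  <receiver android:name="com.ryanheise.audioservice.MediaButtonReceiver">
    <intent-filter>
      <action android:name="android.intent.action.MEDIA_BUTTON" />
    </intent-filter>
  </receiver>
</application>
```

On iOS, the readme asks you to add the `audio` entry to `UIBackgroundModes` in Info.plist so audio can keep playing in the background.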
Let's create the AudioPlayer class first.
AudioPlayer
AudioPlayerTask extends BackgroundAudioTask and runs as a separate isolate, meaning it keeps working in its own thread/isolate, independent of the main isolate.
Let's start by creating some media buttons that show up on the lock screen.
```dart
MediaControl playControl = MediaControl(
  androidIcon: 'drawable/ic_action_play_arrow',
  label: 'Play',
  action: MediaAction.play,
);
MediaControl pauseControl = MediaControl(
  androidIcon: 'drawable/ic_action_pause',
  label: 'Pause',
  action: MediaAction.pause,
);
MediaControl skipToNextControl = MediaControl(
  androidIcon: 'drawable/ic_action_skip_next',
  label: 'Next',
  action: MediaAction.skipToNext,
);
MediaControl skipToPreviousControl = MediaControl(
  androidIcon: 'drawable/ic_action_skip_previous',
  label: 'Previous',
  action: MediaAction.skipToPrevious,
);
MediaControl stopControl = MediaControl(
  androidIcon: 'drawable/ic_action_stop',
  label: 'Stop',
  action: MediaAction.stop,
);
```
Make sure to add the corresponding icon images for Android in the drawable folder of your Android project.
Here is the complete AudioPlayer class code.
```dart
class AudioPlayerTask extends BackgroundAudioTask {
  var _queue = <MediaItem>[];
  int _queueIndex = -1;
  AudioPlayer _audioPlayer = new AudioPlayer();
  AudioProcessingState _skipState;
  bool _playing;

  bool get hasNext => _queueIndex + 1 < _queue.length;

  bool get hasPrevious => _queueIndex > 0;

  MediaItem get mediaItem => _queue[_queueIndex];

  StreamSubscription<AudioPlaybackState> _playerStateSubscription;
  StreamSubscription<AudioPlaybackEvent> _eventSubscription;

  @override
  void onStart(Map<String, dynamic> params) {}

  @override
  void onPlay() {}

  @override
  void onPause() {}

  @override
  void onSkipToNext() async {}

  @override
  void onSkipToPrevious() {}

  void skip(int offset) async {}

  @override
  Future<void> onStop() async {}

  @override
  void onSeekTo(Duration position) {}

  @override
  void onClick(MediaButton button) {}

  @override
  Future<void> onFastForward() async {}

  @override
  Future<void> onRewind() async {}
}
```
Let’s see what it does.
Here we declare a `_queue`, which holds the MediaItem objects that we will play in the audio player.

- `_queueIndex` — tracks the index of the item that is currently playing.
- `_audioPlayer` — plays the audio.
- `_skipState` — the state of the audio while skipping, such as connecting or ready.
- `mediaItem` — the current media item to play.
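Each entry in the queue is a MediaItem describing one track, where the id holds the audio URL. For example, here is one of the items used in the UI code later:

```dart
// A MediaItem describes one track; its id is the audio URL that
// just_audio later loads via setUrl(mediaItem.id).
MediaItem(
  id: "https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3",
  album: "Science Friday",
  title: "A Salute To Head-Scratching Science",
  artist: "Science Friday and WNYC Studios",
  duration: Duration(milliseconds: 5739820),
  artUri: "https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg",
)
```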
The first thing to do is set the queue in the AudioPlayerTask:

```dart
AudioServiceBackground.setQueue(_queue);
```
When the UI calls AudioService.play, the corresponding onPlay() override is invoked, and that is where your playback logic goes. The same applies to all the other overridden methods, such as onPause, onSkipToNext, and so on.
So for example onPlay could look like this…
```dart
@override
void onPlay() {
  if (_skipState == null) {
    _playing = true;
    _audioPlayer.play();
  }
}
```
Similarly, you can implement all the other methods.
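For instance, onPause and onSeekTo simply delegate to the underlying just_audio player (these are taken from the complete class shown later):

```dart
@override
void onPause() {
  _playing = false;
  _audioPlayer.pause();
}

@override
void onSeekTo(Duration position) {
  _audioPlayer.seek(position);
}
```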
Now, to start the AudioService:
```dart
await AudioService.start(
  backgroundTaskEntrypoint: _audioPlayerTaskEntrypoint,
  androidNotificationChannelName: 'Audio Player',
  androidNotificationColor: 0xFF2196f3,
  androidNotificationIcon: 'mipmap/ic_launcher',
  params: params,
);
```
Here ‘_audioPlayerTaskEntrypoint’ is the entry point of the AudioPlayer.
Notice the extra parameter called 'params'. It is used to send data to the AudioTask from outside. Make sure it contains only basic data types, since it has to cross the isolate boundary.
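A minimal sketch of preparing such params, serializing each MediaItem to a JSON map (this mirrors the _startAudioPlayerBtn code in the complete UI below):

```dart
// Serialize the MediaItem queue to plain maps/lists so it can cross
// the isolate boundary as 'params'.
List<dynamic> list = [];
for (MediaItem item in _queue) {
  list.add(item.toJson());
}
var params = {"data": list};
```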
If you want to send a custom action to the background task, you can do that too:

```dart
await AudioService.customAction('key', '<DATA>');
```

To receive that data in the AudioPlayerTask, override the onCustomAction method.
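A minimal sketch of that override inside AudioPlayerTask (the 'key' action name and the print handling are just placeholders):

```dart
// Receives AudioService.customAction('key', '<DATA>') calls from the UI.
@override
Future<dynamic> onCustomAction(String name, dynamic arguments) async {
  if (name == 'key') {
    print('Received custom action data: $arguments');
  }
  return null;
}
```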
You can also send a custom event from the task back to the UI, like this:

```dart
AudioServiceBackground.sendCustomEvent('just played');
```
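On the UI side, these events arrive on AudioService.customEventStream, which you can listen to like any other stream:

```dart
// Listen in the UI for events sent from the background task with
// AudioServiceBackground.sendCustomEvent.
AudioService.customEventStream.listen((event) {
  print('Event from the audio task: $event'); // e.g. 'just played'
});
```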
So, in AudioService.start, `_audioPlayerTaskEntrypoint` must be a top-level or static function, since it runs in a separate isolate. It is a simple function that looks like this:

```dart
void _audioPlayerTaskEntrypoint() async {
  AudioServiceBackground.run(() => AudioPlayerTask());
}
```
Listen to State changes in the UI
So how do we listen to the audio player's state changes in the UI?
AudioPlayer sends data to the UI using streams.
The AudioService package exposes a method called AudioServiceBackground.setState to send data to the UI.
For example, the _setState method in the AudioPlayerTask class may look like this:

```dart
Future<void> _setState({
  AudioProcessingState processingState,
  Duration position,
  Duration bufferedPosition,
}) async {
  print('SetState $processingState');
  if (position == null) {
    position = _audioPlayer.playbackEvent.position;
  }
  await AudioServiceBackground.setState(
    controls: getControls(),
    systemActions: [MediaAction.seekTo],
    processingState:
        processingState ?? AudioServiceBackground.state.processingState,
    playing: _playing,
    position: position,
    bufferedPosition: bufferedPosition ?? position,
    speed: _audioPlayer.speed,
  );
}
```
To receive that state in the UI and update our display, we need a StreamBuilder. The audio task streams out different types of data, for example:

- The list of media items — AudioService.queueStream
- The currently playing media item — AudioService.currentMediaItemStream
- The playback state — AudioService.playbackStateStream
Instead of creating three different StreamBuilders in the UI, we can combine all three streams into one using RxDart.
Let's create a simple class to hold the data from the three streams; I am naming it AudioState.
```dart
class AudioState {
  final List<MediaItem> queue;
  final MediaItem mediaItem;
  final PlaybackState playbackState;

  AudioState(this.queue, this.mediaItem, this.playbackState);
}
```
and a getter to create the combined stream:

```dart
Stream<AudioState> get _audioStateStream {
  return Rx.combineLatest3<List<MediaItem>, MediaItem, PlaybackState,
      AudioState>(
    AudioService.queueStream,
    AudioService.currentMediaItemStream,
    AudioService.playbackStateStream,
    (queue, mediaItem, playbackState) => AudioState(
      queue,
      mediaItem,
      playbackState,
    ),
  );
}
```
In the UI
```dart
@override
Widget build(BuildContext context) {
  return Scaffold(
    appBar: AppBar(
      title: Text('Audio Player'),
    ),
    body: Container(
      padding: EdgeInsets.all(20.0),
      color: Colors.white,
      child: StreamBuilder<AudioState>(
        stream: _audioStateStream,
        builder: (context, snapshot) {
          final audioState = snapshot.data;
          final queue = audioState?.queue;
          final mediaItem = audioState?.mediaItem;
          final playbackState = audioState?.playbackState;
          final processingState =
              playbackState?.processingState ?? AudioProcessingState.none;
          final playing = playbackState?.playing ?? false;
          // ... build the player widgets from these values
          // (see the complete UI code below)
        },
      ),
    ),
  );
}
```
Here we have all the data to be processed in the UI.
Here is what the complete AudioPlayer.dart file looks like.
```dart
import 'dart:async';

import 'package:audio_service/audio_service.dart';
import 'package:just_audio/just_audio.dart';

MediaControl playControl = MediaControl(
  androidIcon: 'drawable/ic_action_play_arrow',
  label: 'Play',
  action: MediaAction.play,
);
MediaControl pauseControl = MediaControl(
  androidIcon: 'drawable/ic_action_pause',
  label: 'Pause',
  action: MediaAction.pause,
);
MediaControl skipToNextControl = MediaControl(
  androidIcon: 'drawable/ic_action_skip_next',
  label: 'Next',
  action: MediaAction.skipToNext,
);
MediaControl skipToPreviousControl = MediaControl(
  androidIcon: 'drawable/ic_action_skip_previous',
  label: 'Previous',
  action: MediaAction.skipToPrevious,
);
MediaControl stopControl = MediaControl(
  androidIcon: 'drawable/ic_action_stop',
  label: 'Stop',
  action: MediaAction.stop,
);

class AudioPlayerTask extends BackgroundAudioTask {
  var _queue = <MediaItem>[];
  int _queueIndex = -1;
  AudioPlayer _audioPlayer = new AudioPlayer();
  AudioProcessingState _skipState;
  bool _playing;

  bool get hasNext => _queueIndex + 1 < _queue.length;

  bool get hasPrevious => _queueIndex > 0;

  MediaItem get mediaItem => _queue[_queueIndex];

  StreamSubscription<AudioPlaybackState> _playerStateSubscription;
  StreamSubscription<AudioPlaybackEvent> _eventSubscription;

  @override
  void onStart(Map<String, dynamic> params) {
    // Rebuild the queue from the JSON data passed in from the UI.
    _queue.clear();
    List mediaItems = params['data'];
    for (int i = 0; i < mediaItems.length; i++) {
      MediaItem mediaItem = MediaItem.fromJson(mediaItems[i]);
      _queue.add(mediaItem);
    }

    _playerStateSubscription = _audioPlayer.playbackStateStream
        .where((state) => state == AudioPlaybackState.completed)
        .listen((state) {
      _handlePlaybackCompleted();
    });
    _eventSubscription = _audioPlayer.playbackEventStream.listen((event) {
      final bufferingState =
          event.buffering ? AudioProcessingState.buffering : null;
      switch (event.state) {
        case AudioPlaybackState.paused:
          _setState(
              processingState: bufferingState ?? AudioProcessingState.ready,
              position: event.position);
          break;
        case AudioPlaybackState.playing:
          _setState(
              processingState: bufferingState ?? AudioProcessingState.ready,
              position: event.position);
          break;
        case AudioPlaybackState.connecting:
          _setState(
              processingState: _skipState ?? AudioProcessingState.connecting,
              position: event.position);
          break;
        default:
      }
    });

    AudioServiceBackground.setQueue(_queue);
    onSkipToNext();
  }

  @override
  void onPlay() {
    if (_skipState == null) {
      _playing = true;
      _audioPlayer.play();
    }
  }

  @override
  void onPause() {
    _playing = false;
    _audioPlayer.pause();
  }

  @override
  void onSkipToNext() async {
    skip(1);
  }

  @override
  void onSkipToPrevious() {
    skip(-1);
  }

  void skip(int offset) async {
    int newPos = _queueIndex + offset;
    if (!(newPos >= 0 && newPos < _queue.length)) {
      return;
    }
    if (null == _playing) {
      // First time: we want to start playing.
      _playing = true;
    } else if (_playing) {
      // Stop the current item before skipping.
      await _audioPlayer.stop();
    }
    _queueIndex = newPos;
    _skipState = offset > 0
        ? AudioProcessingState.skippingToNext
        : AudioProcessingState.skippingToPrevious;
    AudioServiceBackground.setMediaItem(mediaItem);
    await _audioPlayer.setUrl(mediaItem.id);
    print(mediaItem.id);
    _skipState = null;
    if (_playing) {
      onPlay();
    } else {
      _setState(processingState: AudioProcessingState.ready);
    }
  }

  @override
  Future<void> onStop() async {
    _playing = false;
    await _audioPlayer.stop();
    await _audioPlayer.dispose();
    _playerStateSubscription.cancel();
    _eventSubscription.cancel();
    return await super.onStop();
  }

  @override
  void onSeekTo(Duration position) {
    _audioPlayer.seek(position);
  }

  @override
  void onClick(MediaButton button) {
    playPause();
  }

  @override
  Future<void> onFastForward() async {
    await _seekRelative(fastForwardInterval);
  }

  @override
  Future<void> onRewind() async {
    await _seekRelative(rewindInterval);
  }

  Future<void> _seekRelative(Duration offset) async {
    var newPosition = _audioPlayer.playbackEvent.position + offset;
    // Clamp the new position within the bounds of the media item.
    if (newPosition < Duration.zero) {
      newPosition = Duration.zero;
    }
    if (newPosition > mediaItem.duration) {
      newPosition = mediaItem.duration;
    }
    await _audioPlayer.seek(newPosition);
  }

  _handlePlaybackCompleted() {
    if (hasNext) {
      onSkipToNext();
    } else {
      onStop();
    }
  }

  void playPause() {
    if (AudioServiceBackground.state.playing)
      onPause();
    else
      onPlay();
  }

  Future<void> _setState({
    AudioProcessingState processingState,
    Duration position,
    Duration bufferedPosition,
  }) async {
    print('SetState $processingState');
    if (position == null) {
      position = _audioPlayer.playbackEvent.position;
    }
    await AudioServiceBackground.setState(
      controls: getControls(),
      systemActions: [MediaAction.seekTo],
      processingState:
          processingState ?? AudioServiceBackground.state.processingState,
      playing: _playing,
      position: position,
      bufferedPosition: bufferedPosition ?? position,
      speed: _audioPlayer.speed,
    );
  }

  List<MediaControl> getControls() {
    if (_playing) {
      return [
        skipToPreviousControl,
        pauseControl,
        stopControl,
        skipToNextControl
      ];
    } else {
      return [
        skipToPreviousControl,
        playControl,
        stopControl,
        skipToNextControl
      ];
    }
  }
}

class AudioState {
  final List<MediaItem> queue;
  final MediaItem mediaItem;
  final PlaybackState playbackState;

  AudioState(this.queue, this.mediaItem, this.playbackState);
}
```
And our main UI looks like this:
```dart
import 'dart:math';

import 'package:audio_service/audio_service.dart';
import 'package:flutter/material.dart';
import 'package:rxdart/rxdart.dart';

import 'AudioPlayerTask.dart';

class BGAudioPlayerScreen extends StatefulWidget {
  @override
  _BGAudioPlayerScreenState createState() => _BGAudioPlayerScreenState();
}

class _BGAudioPlayerScreenState extends State<BGAudioPlayerScreen> {
  final BehaviorSubject<double> _dragPositionSubject =
      BehaviorSubject.seeded(null);

  final _queue = <MediaItem>[
    MediaItem(
      id: "https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3",
      album: "Science Friday",
      title: "A Salute To Head-Scratching Science",
      artist: "Science Friday and WNYC Studios",
      duration: Duration(milliseconds: 5739820),
      artUri:
          "https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg",
    ),
    MediaItem(
      id: "https://s3.amazonaws.com/scifri-segments/scifri201711241.mp3",
      album: "Science Friday",
      title: "From Cat Rheology To Operatic Incompetence",
      artist: "Science Friday and WNYC Studios",
      duration: Duration(milliseconds: 2856950),
      artUri:
          "https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg",
    ),
  ];

  bool _loading;

  @override
  void initState() {
    super.initState();
    _loading = false;
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('Audio Player'),
      ),
      body: Container(
        padding: EdgeInsets.all(20.0),
        color: Colors.white,
        child: StreamBuilder<AudioState>(
          stream: _audioStateStream,
          builder: (context, snapshot) {
            final audioState = snapshot.data;
            final queue = audioState?.queue;
            final mediaItem = audioState?.mediaItem;
            final playbackState = audioState?.playbackState;
            final processingState =
                playbackState?.processingState ?? AudioProcessingState.none;
            final playing = playbackState?.playing ?? false;
            return Container(
              width: MediaQuery.of(context).size.width,
              child: Column(
                mainAxisAlignment: MainAxisAlignment.center,
                crossAxisAlignment: CrossAxisAlignment.center,
                mainAxisSize: MainAxisSize.max,
                children: [
                  if (processingState == AudioProcessingState.none) ...[
                    _startAudioPlayerBtn(),
                  ] else ...[
                    //positionIndicator(mediaItem, playbackState),
                    SizedBox(height: 20),
                    if (mediaItem?.title != null) Text(mediaItem.title),
                    SizedBox(height: 20),
                    Row(
                      mainAxisAlignment: MainAxisAlignment.center,
                      children: [
                        !playing
                            ? IconButton(
                                icon: Icon(Icons.play_arrow),
                                iconSize: 64.0,
                                onPressed: AudioService.play,
                              )
                            : IconButton(
                                icon: Icon(Icons.pause),
                                iconSize: 64.0,
                                onPressed: AudioService.pause,
                              ),
                        // IconButton(
                        //   icon: Icon(Icons.stop),
                        //   iconSize: 64.0,
                        //   onPressed: AudioService.stop,
                        // ),
                        Row(
                          mainAxisAlignment: MainAxisAlignment.center,
                          children: [
                            IconButton(
                              icon: Icon(Icons.skip_previous),
                              iconSize: 64,
                              onPressed: () {
                                if (mediaItem == queue.first) {
                                  return;
                                }
                                AudioService.skipToPrevious();
                              },
                            ),
                            IconButton(
                              icon: Icon(Icons.skip_next),
                              iconSize: 64,
                              onPressed: () {
                                if (mediaItem == queue.last) {
                                  return;
                                }
                                AudioService.skipToNext();
                              },
                            )
                          ],
                        ),
                      ],
                    )
                  ]
                ],
              ),
            );
          },
        ),
      ),
    );
  }

  _startAudioPlayerBtn() {
    // Serialize the queue into basic types before passing it as params.
    List<dynamic> list = List();
    for (int i = 0; i < 2; i++) {
      var m = _queue[i].toJson();
      list.add(m);
    }
    var params = {"data": list};

    return MaterialButton(
      child: Text(_loading ? "Loading..." : 'Start Audio Player'),
      onPressed: () async {
        setState(() {
          _loading = true;
        });
        await AudioService.start(
          backgroundTaskEntrypoint: _audioPlayerTaskEntrypoint,
          androidNotificationChannelName: 'Audio Player',
          androidNotificationColor: 0xFF2196f3,
          androidNotificationIcon: 'mipmap/ic_launcher',
          params: params,
        );
        setState(() {
          _loading = false;
        });
      },
    );
  }

  Widget positionIndicator(MediaItem mediaItem, PlaybackState state) {
    double seekPos;
    return StreamBuilder(
      stream: Rx.combineLatest2<double, double, double>(
          _dragPositionSubject.stream,
          Stream.periodic(Duration(milliseconds: 200)),
          (dragPosition, _) => dragPosition),
      builder: (context, snapshot) {
        double position =
            snapshot.data ?? state.currentPosition.inMilliseconds.toDouble();
        double duration = mediaItem?.duration?.inMilliseconds?.toDouble();
        return Column(
          children: [
            if (duration != null)
              Slider(
                min: 0.0,
                max: duration,
                value: seekPos ?? max(0.0, min(position, duration)),
                onChanged: (value) {
                  _dragPositionSubject.add(value);
                },
                onChangeEnd: (value) {
                  AudioService.seekTo(Duration(milliseconds: value.toInt()));
                  // Due to a delay in platform channel communication, there is
                  // a brief moment after releasing the Slider thumb before the
                  // new position is broadcast from the platform side. This
                  // hack is to hold onto seekPos until the next state update
                  // comes through.
                  // TODO: Improve this code.
                  seekPos = value;
                  _dragPositionSubject.add(null);
                },
              ),
            Text("${state.currentPosition}"),
          ],
        );
      },
    );
  }
}

Stream<AudioState> get _audioStateStream {
  return Rx.combineLatest3<List<MediaItem>, MediaItem, PlaybackState,
      AudioState>(
    AudioService.queueStream,
    AudioService.currentMediaItemStream,
    AudioService.playbackStateStream,
    (queue, mediaItem, playbackState) => AudioState(
      queue,
      mediaItem,
      playbackState,
    ),
  );
}

void _audioPlayerTaskEntrypoint() async {
  AudioServiceBackground.run(() => AudioPlayerTask());
}
```
In the complete example above, you can see how the queue data is sent from the main UI to the background task via params.
Complete Source Code