My partner and I are building an iOS app in which we want the possibility of livestreaming. We need a library/SDK that uses FFmpeg to encode video directly from the phone's camera and then send it to an API. In short, the library/SDK we need has to do exactly what the Wowza GoCoder SDK does.
Our requirements are that the encoder should be written in Swift (or alternatively Objective-C), and that it uses FFmpeg to encode the video on the phone to the H.264 codec and the audio to AAC. Beyond that, we need the possibility to send this encoded video to a given API, along with further information such as application name, stream name, host address, and port number.
The library/SDK also needs to be able to preview the video on the sender's phone, in a UIView or UIImageView, and let the user switch between the front and back cameras. It has to be possible to switch between different resolutions of the encoded video, so that we can choose (in code) whether users send video at 640x480, 1280x720, etc., and also configure the audio bitrate (also in code).
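To make the desired shape of the SDK concrete, here is a rough Swift sketch of how such a configuration and encoder API could look. Every type and method name below (StreamConfig, FFStreamEncoder, etc.) is an assumption on our side for illustration, not an existing library:

```swift
// Hypothetical sketch only — these names are illustrative, not a real API.
import UIKit

struct StreamConfig {
    var host: String            // API host address
    var port: Int               // port number
    var applicationName: String
    var streamName: String
    var videoSize: CGSize       // e.g. 640x480 or 1280x720, chosen in code
    var audioBitrate: Int       // AAC bitrate in bits per second, e.g. 64_000
}

final class FFStreamEncoder {
    init(config: StreamConfig) { /* set up FFmpeg H.264/AAC contexts */ }
    func attachPreview(to view: UIView) { /* render camera frames into the view */ }
    func switchCamera() { /* toggle between front and back camera */ }
    func start() { /* begin encoding and sending to the configured host/port */ }
    func stop() { /* tear down the session */ }
}
```

This is only meant to show the level of convenience we are after; the actual naming and structure are up to the developer.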
We need a delegate that reports the video status: for example, if the video is stopped, one action has to be called; if the video is paused, another has to be called; and so on.
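As an illustration of the delegate we have in mind, here is a minimal Swift sketch; the protocol, enum, and method names are purely our assumptions:

```swift
// Hypothetical sketch — all names are our assumptions, not an existing API.
import Foundation

enum StreamStatus {
    case started
    case paused
    case stopped
    case failed(Error)
}

protocol StreamStatusDelegate: AnyObject {
    // Called whenever the stream changes state, so the app can react
    // (e.g. update the UI when the video is paused or stopped).
    func stream(didChangeStatus status: StreamStatus)
}
```

The app would set itself as the delegate and branch on the status to trigger the appropriate action.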
The library/SDK has to be easy to import into our application, and should provide convenience methods that make it easy to use.
Please let me know if this sounds doable for you, or if you have any additional questions.
The price we added in this job offer isn't necessarily the final price. We have no idea how long this assignment will take, but if you're the right man/woman for the job, we can obviously discuss the price.