I have a mobile app (on iOS and Android) and a trained identifier. I am looking for a specialist (or specialists) who can create a landmark map using this identifier, then take our existing animations and handle the compositing, smart-resize and image-processing tasks needed to create Snapchat-style filters over photos and short video clips. PixLab does something very similar, to give you an idea of the coding involved.
I can go into a lot more detail if you are interested in this project, so please get in touch if you are.
We are migrating from WooCommerce and require our site's aesthetic and functionality to be mirrored on Shopify using the Bold Checkout app. See the attached screenshot for the current aesthetic to be duplicated on Shopify.
The functionality is as follows:
- Customers can select multiple variants simultaneously, e.g. ordering multiple flavours at the same time
- Discounts are automatically applied depending on the quantity and whether one-off purchase or subscription
- Minimum order for a subscription is two items
- A free gift is added when two or more products are in the bag
- Customers can change flavours/quantity and frequency on their subscription page
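To make the ordering rules above concrete, here is a minimal Python sketch of the pricing logic. The discount tiers and percentages are placeholder assumptions for illustration; only the subscription minimum of two and the free gift at two products come from the brief.

```python
# Hypothetical sketch of the ordering rules; discount tiers are placeholders.

def price_order(quantity: int, unit_price: float, is_subscription: bool) -> dict:
    """Apply quantity/subscription discounts, enforce the subscription
    minimum, and flag the free gift at two or more products."""
    if is_subscription and quantity < 2:
        raise ValueError("Subscriptions require a minimum order of 2")

    # Placeholder tiers: subscriptions discount deeper than one-off purchases.
    if quantity >= 4:
        discount = 0.15 if is_subscription else 0.10
    elif quantity >= 2:
        discount = 0.10 if is_subscription else 0.05
    else:
        discount = 0.0

    total = round(quantity * unit_price * (1 - discount), 2)
    return {
        "total": total,
        "discount": discount,
        "free_gift": quantity >= 2,  # free gift when two products are in the bag
    }
```

In the real build this logic would live in the Bold/Shopify discount configuration rather than custom code; the sketch is only to pin down the intended behaviour.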
The aesthetic and UI need to mirror the attached image.
We are looking for a highly experienced Liquid developer specialising in subscription app customisation, with excellent knowledge of payment gateways and a full understanding of integrating APIs with both the platform and the gateway.
For complete details, see my blog post on Steemit.com.
I'm looking to hire someone to put a game-like GUI on my Max/MSP project using OpenGL.
There has been much deliberation, but the consensus seems to be to create this game interface as a separate app, which will receive communication from the main program via OSC. This will allow for flexibility in choosing which framework/language to write it in, as well as provide a path for eventually building it into a full standalone video game. It's possible the best framework for the job would be Cinder or even Unity, but openFrameworks, Nannou, JUCE or any other would also be options. The possibility of doing it natively in Jitter also still exists.
This project is destined to be a rhythm game not unlike Guitar Hero, except that you write the music you're playing to as you go. Below is a video of the system in action as it currently stands. I start playing and when it likes what it hears it starts building a song around me in real-time. If it's a little hard to differentiate what's what audibly, that's where this new visual interface will help.
There are essentially three things this first iteration needs to do:
1) Like Guitar Hero, "targets" of some kind need to appear and disappear on the screen, denoting moments in the future to aim for when playing a note.
2) Targets need to enter into "scoring animations" when users hit a note at the right time.
3) Along with each scoring animation, a numeric score (e.g. +500) will need to pop up from the target and then disappear. A cumulative total of the scores will need to be kept track of in a permanent scoreboard off to the side, along with a few other metrics.
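The three requirements above can be sketched without any rendering code. This is a minimal Python illustration of the timing and scoring logic only; the hit-window width, the +500 value, and the metric names are assumptions, not specifications from the brief.

```python
# Sketch of the rhythm-game scoring logic (no graphics); values are assumed.

HIT_WINDOW_MS = 100   # how close (in ms) a played note must be to a target
HIT_SCORE = 500       # points per hit, shown as a "+500" popup in the GUI

class Scoreboard:
    """Tracks the cumulative total plus a couple of extra metrics."""
    def __init__(self):
        self.total = 0
        self.hits = 0
        self.misses = 0

    def judge(self, target_time_ms: int, note_time_ms: int) -> int:
        """Return the points for one note and update the running totals."""
        if abs(note_time_ms - target_time_ms) <= HIT_WINDOW_MS:
            # The GUI would trigger the scoring animation and popup here.
            self.total += HIT_SCORE
            self.hits += 1
            return HIT_SCORE
        self.misses += 1
        return 0
```

In practice the target and note timestamps would arrive from the Max/MSP patch over OSC, and the chosen framework (Cinder, Unity, openFrameworks, etc.) would drive the target and popup animations from these judgements.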
I'm looking to build a news app that reads from my website and sends a daily reminder when there's an update.
The app must be able to share to Facebook and create pop-up notifications. It will also require account setup and subscription support, all on KaiOS, plus the ability for users to type comments back to us.