It would be really nice if we could “cast” the app onto another device (e.g. Google Home), similar to what you can do with the apps for YouTube, Netflix etc.
This would be especially nice for the songs part of the app, so we can get the display up on a big screen and audio coming out of some decent speakers instead of the squeaky little ones on the phone/tablet. It would also help prevent me from dropping my phone off my music stand like three times a day.
Just tried it by casting from the Google Home app to my TV and letting the app run in landscape mode, and it works ok.
It’s not very optimised though, and I’m getting a fair bit of latency, as it’s not loading straight onto the Chromecast the way YT or Netflix do.
YMMV I guess
I don’t believe it will. The latency is caused by beaming the data from one device to another, which is always going to be “slower” than zapping it down a cable. It’s why bluetooth headsets/speakers are not particularly good for the JG app either, because they add latency.
Native support or not, the data is still going to be transmitted the same way.
Not necessarily. Latency will always be an issue/consideration of course, but there are ways it can be reduced. I’m not a mobile app developer, but when developing in general you have access to APIs you can leverage that help with these kinds of issues. These are the kinds of optimizations that get looked into when creating native support.
Again, not a mobile app dev, so I don’t know if they’re already doing this or if Android/iOS offer something like that, but my point is only that it’s not always so cut and dried. It’s like developing video games: you don’t have to leverage DirectX or open standards like Vulkan on Windows, but if you do develop native support you gain access to more options for optimization.
I was watching Premier League games from my phone on the TV using Smartview casting between my Galaxy phone and Samsung TV, and there was no latency there. I reckon if you use the app you might get some when you’re tapping on it (doubt it will be much), but surely when you play along latency doesn’t matter, as you’re only looking at the TV screen?
APIs are still limited by the physical limits. If it takes 0.5ms (for example) for data to physically travel through the air from point A to B, then that’s how long it takes. That’s why cables are generally faster than wireless, because data travels quicker along a wire or fibre cable than it does through air. It’s also why shorter cables are faster than longer cables.
The time can be extended by going through unnecessary stages and by poorly developed software, but the physical limit can’t be reduced. Simply adding “native” support for such a feature is not really going to do much. I’d bet that the people who developed the casting features of Apple and Android phones know more about how to efficiently transmit data than the good people at Musopia do.
But I digress… this probably isn’t even your issue anyway. As someone pointed out, if you’re watching and listening to the app on a single device, then latency shouldn’t be a concern. A disconnect between what you’re seeing and hearing is something else. (Sounds exactly like the issue I have when using a Fire Stick with VLC to watch movies from my PC. I have to adjust the audio delay to get the sound in sync with the picture, every time.)
You’d be surprised. Besides, a lot of these things can be offered but not used. Going back to the video game example, Microsoft knows much more about optimizing for the Windows platform than the average developer, which is why they offer DirectX. Apple and iOS’s general streaming tech is just that, “general”, and may not be best for every situation. That’s why companies offer things like APIs, so developers can use whatever best suits what they’re doing. And that’s why native support may (or may not, depending on what’s available/what they’re already leveraging) help with this.
I think there is a big discrepancy in what you guys are talking about. If you use Roku or Chromecast, you’re using a sort of third-party adapter between your devices and the TV. Many modern TVs have a built-in Miracast receiver, which means apps can cast to them directly, so using a proxy like Chromecast or Roku (where, btw, it’s entirely possible that latency happens, since you introduce an extra device yours needs to connect through) is not necessary. Most Android phones have a Smartview option, so you don’t need any third parties. I just checked my phone and there’s no latency; everything works like a charm with Justin’s app, and unless you have unstable WiFi at your place you should be ready to rock.
On my Samsung you swipe down the top menu and look for the Smartview icon, then you pick your TV off the list and connect. Easy.
Just to be clear, this (Smartview) is a Samsung-specific thing which only works from Samsung phones/tablets to a Samsung TV or other Samsung streaming device.
The standard casting system for Android is Chromecast.
As far as latency is concerned, for a playback app latency should not matter, as long as the audio and video are synchronised.
Latency only matters for two-way applications (like recording whilst monitoring in software) or if you are trying to sync two separate things (such as the phone speaker with the TV speaker or activating buttons on the phone screen to the TV screen as in a game).
I think other Android phones have similar casting software that is not necessarily Chromecast? Maybe it’s not Smartview per se, but it’s also not Chromecast. Nevertheless, we agree on one thing: latency should not be an issue.
I haven’t tried casting, but the discussion of latency in this thread sounds similar to what I have recently experienced.
I have a Fender Mustang Micro plugged into my electric guitar, and headphones plugged into that by wire for guitar sound. That works great – no noticeable lag.
The Micro, in turn, can connect by bluetooth to a device on which I run the JG songs app, so I can watch and listen to that at the same time I’m hearing my guitar. The only problem is that if I’m watching the visual indicator as a cue for when each bar starts, that lags a fraction of a second, just enough to throw me off. I compensate by looking at the cue for “next chord” but otherwise trying to hear the changes instead of seeing them.
Relying on sound more than visual cues is good practice, but the visual cues are intended to be synchronized, I’m sure. I assume the lag happens because of the latency introduced by the Micro relaying the device’s audio through a bluetooth connection. If I plugged my headphones directly into the device, I wouldn’t be plugged into my guitar anymore.
It’s impossible to remove all latency, I know, but I wonder if the issue could be addressed by allowing the user to set a latency value that would delay the video by X milliseconds to bring it into sync. The perfect offset wouldn’t be the same value for every possible setup, so it would need to be user configurable, not a “hard-wired” preset.
I have no idea how hard that is, but I’ve seen latency handled that way in another case: the VR game “Beat Saber”. Synchronization is very important in that game. In one of the settings screens, there’s a dot that bounces left and right, with sounds that are supposed to sync up with it hitting either extreme, and one at the midway point. The user adjusts a slider until they think the bleeps are blipping at the right time, and then saves that as the latency adjustment, which gets applied to the game. So the devs didn’t eliminate latency (impossible), but they added a way to introduce a small delay to the visuals to get everything back in sync.
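For what it’s worth, the kind of user-set offset being suggested here is usually just a constant added to the visual-cue schedule. Here’s a minimal sketch of the idea in Python (all names are made up for illustration; this has nothing to do with how the JG app or Beat Saber actually work internally):

```python
def apply_av_offset(cue_times_ms, offset_ms):
    """Shift every visual-cue timestamp later by the user's offset.

    A positive offset delays the visuals, compensating for audio that
    arrives late over a Bluetooth or casting link.
    """
    return [t + offset_ms for t in cue_times_ms]


# A beat every 500 ms (120 bpm), with a 150 ms user calibration:
beats = [0, 500, 1000, 1500]
print(apply_av_offset(beats, 150))  # → [150, 650, 1150, 1650]
```

The point is just that the app wouldn’t need to know *why* the audio is late (Bluetooth codec, casting pipeline, TV processing); a single user-calibrated number covers all of it, which is exactly what the Beat Saber-style slider measures.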