My car does most of the driving. Just gotta set the cruise speed and it keeps up with traffic and stays in the lane. It does a better job than 80% of the drivers out there.
On a stretch of Memorial where it goes from 70 to 50 km/h?! hahahaha
Hard to believe, but you are an even worse driver than the poster! hahahaha
You must be the 1st idiot in Calgary to set cruise control on Memorial! LOL
__________________ Peter12 "I'm no Trump fan but he is smarter than most if not everyone in this thread."
Send it to 311 as a safety issue and request a No U-Turn sign and enforcement.
Enforcement? I work near a traffic-light intersection that sees a U-turn nearly every 10 minutes during the day. Never, ever seen any enforcement.
Can they just knock it off with the HEIC format for photos? I get texted photos of riser sites and they come in this format that no one uses or can open. Now I have to phone IT and get some crap installed to convert them to something I can save and send out. Is there a better way to deal with them?
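For what it's worth, a few lines of Python will batch-convert them without waiting on IT. A minimal sketch, assuming the Pillow and pillow-heif packages are installed; the folder and file names are just placeholders:

Code:
# Convert every .heic photo in the current folder to a .jpg copy.
# Assumes: pip install pillow pillow-heif
from pathlib import Path

from PIL import Image
from pillow_heif import register_heif_opener

register_heif_opener()  # teaches Pillow how to open HEIC/HEIF files

for heic_path in Path(".").glob("*.heic"):
    jpg_path = heic_path.with_suffix(".jpg")
    # JPEG has no alpha channel, so force RGB before saving
    Image.open(heic_path).convert("RGB").save(jpg_path, "JPEG", quality=90)
    print(f"Converted {heic_path.name} -> {jpg_path.name}")

(On the sending side, iPhone users can also flip Settings > Camera > Formats to "Most Compatible" so new photos are captured as JPEG in the first place.)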
Ban Apple.
Random question, are the Rayban Meta glasses actually kinda neat and useful? Or are they just a gimmick in their current form?
They have some pretty neat features but nothing I'd say is groundbreaking. If someone made a pair of wireless earbuds that had a wide-angle portrait-oriented camera on one side of them, that's basically what they are; the sunglasses just allow fitting the tech into a form factor that doesn't scream "look at this cyberpunk weirdo".
The microphones are pretty good, and the video quality is respectable for such a small camera. The battery life is pretty mediocre, even just listening to music; it's somewhat shocking how bad it is without using any of the other features. Video recording is pre-set to 3 minutes, or can be reduced to 1 minute. You can end a recording early, but you can't extend it; you have to start another clip.
They have a 'conversation focus' feature that's in early access, which is something my LG Tone Free FP9s do without needing a camera. Meta's implementation uses the camera and mic array to identify who you're facing and focus on that specific speaker, so if you turn away for a moment, that voice gets blended back into the background noise until you face forward again, which seems like a bit of a miss IMO. If there's any intelligence at play, it should also recognize that the voice I've been facing is still the one I want amplified while it's still speaking.
There's also a 'live translation' feature when using the app. It captures the speaker's audio, translates it in the app, and speaks it back to you using the glasses' built-in speakers.
The problem is that you need to know what language the other person is speaking and set it up first in the app, as the language pairing pack will then be downloaded to the glasses, which is a slow and clunky experience. You aren't going into a room full of people from around the world and seamlessly understanding six different foreign nationals speaking to you in their native tongues. The list of supported languages is very limited (six at the time of writing: English, French, German, Italian, Portuguese, and Spanish), and only one language pair can be stored on the glasses at a time. So you have to pick English as your language, and French as the other, and then the app will download that translation pair to the glasses. If someone speaks Italian, you have to go back into the app, select Italian to English, and download the language pack to the glasses again.
This seems to be a really unintelligent implementation of speech-to-text translation when other translation apps allow you to download all the language packs to your phone so they're ready to use at a moment's notice. I don't see the benefit of loading the language pack on the glasses when the phone still appears to be doing the work anyway.
For something they've shoved into the 'Meta AI' app, I really don't see where the 'AI' comes into play. They could have had the live translation feature capture the audio in real-time, feed it into an AI, have the AI perform the translation, and speak it back in English using the onboard speakers, and you wouldn't need to tell the AI the source language -- it would just figure it out (see the sketch at the end of this post). But what the hell do I know.
On that note, while media imports transfer seamlessly over Bluetooth, downloading language packs to the glasses requires the app to connect your phone directly to the glasses' own WiFi network. Same with updates.
So if you want to walk around listening to music without your ears being completely closed off, stay aware of your surroundings, occasionally snap a photo or video, or record driving clips, they work great. Instagrammer / blogger-type hardware for sure. As some sort of entry point into AI-enabled wearables, they aren't that at all.
Pretty gimmicky.
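For what it's worth, the pipeline described a couple of paragraphs up (capture the audio, auto-detect the language, translate to English, speak it back) is roughly what off-the-shelf tools can already do on a laptop. A minimal sketch, assuming the openai-whisper and pyttsx3 packages (plus ffmpeg) are installed and using a placeholder clip name; this is not how Meta's app actually works, just the idea:

Code:
# Rough sketch of the "it would just figure it out" pipeline:
# transcribe a clip, auto-detect its language, translate to English,
# then speak the English text back out loud.
# Assumes: pip install openai-whisper pyttsx3, and ffmpeg on PATH.
import whisper
import pyttsx3

model = whisper.load_model("base")  # small multilingual model

# task="translate" makes Whisper emit English text no matter what
# language was spoken; it detects the source language on its own.
result = model.transcribe("clip_from_glasses.wav", task="translate")

print("Detected language:", result["language"])
print("English text:", result["text"])

# Speak the translation through the default audio output.
engine = pyttsx3.init()
engine.say(result["text"])
engine.runAndWait()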
Quote:
Originally Posted by Wormius
I’d say they’re a godsend to creeps.
They've made changes since they were released to prevent that, thankfully.
The signature Wayfarer ingots on either side of the frame are replaced by a camera lens and a recording-notification LED. Initially this LED glowed solid but it wasn't obvious enough, so they've updated it to pulse to draw more attention to the fact that recording is occurring. The problem is they used a white LED and not a red one, so it's still pretty easy to miss.
The glasses also compare the light levels seen by the camera and by the LED to tell if you're trying to cover the recording LED, and will prevent you from using the camera if it detects the LED is covered or blocked, which is good.
Quote:
Originally Posted by Fuzz
They are useful at helping others to identify you as the person to avoid contact with.
Unless you're looking for it, most people won't even clock them as being anything other than regular Wayfarers.
__________________
-James
GO FLAMES GO.
Quote:
Originally Posted by Azure
Typical dumb take.
Of course they did... what don't those guys ruin...
Quote:
Originally Posted by Fuzz
They are useful at helping others to identify you as the person to avoid contact with.
I was thinking they might be useful when traveling, with the live translation and the ability to essentially "Google lens" things I have no clue about.
Quote:
Originally Posted by TorqueDog
They have some pretty neat features but nothing I'd say is groundbreaking. If someone made a pair of wireless earbuds that had a wide-angle portrait-oriented camera on one side of them, that's basically what they are; the sunglasses just allow fitting the tech into a form factor that doesn't scream "look at this cyberpunk weirdo".
Nice. They seem like they're entering the realm of reasonably priced AI glasses with limited current usefulness. I was thinking more about whether these would be useful tools while traveling; the Even Realities glasses were the other ones that seemed interesting. The other primary reason I was thinking about these glasses is to potentially use them to learn and practice languages. I used to cheap out on data while traveling, which in hindsight was stupid because it hindered my ability to really enjoy my travels and not waste as much time getting lost. I was thinking the ability to further immerse myself while traveling would make something like this worth considering.
I was thinking I might look into playing with a pair if the Gen 2 ended up around $250. Seems like something to keep tabs on, but not pursue yet.
I mean, yeah, they should be obvious when you're looking at the guy wearing them in a product marketing photo specifically designed to highlight the glasses as a main focal point.
Walking down the sidewalk on 17 Ave SW during a Saturday afternoon will yield different results. It's not like someone walking past you while wearing Dame Edna spectacles; the vast majority of people will just see relatively nondescript sunglasses.
I know a guy with a pair and I didn't notice until he told me what they were. I also tried them, and I just didn't think they could do anything I'd care about at this point. Now, if I could wear those to the golf course and they could read a green for me and give me the Tiger Woods EA Sports putting line, I'd buy them today!
Damn I should build that.
__________________
If you don't pass this sig to ten of your friends, you will become an Oilers fan.