I’m back
for a minute or two 😉
--
It’s been a really long [relative] time since my previous post on Medium, but given that its content still applies (and that I’m still “cherry picking” the good life — I force it to be so — every day), I’m not really sorry, haha.
But I think it’s time to share a couple of things here again; I’d rather do so before I forget about them and/or before I lose all my followers forever, anyway. 😀
Firstly — skip this if you’re not a developer — if you (or I) ever need to remote-debug WebView2 on Windows, i.e. if you (or I) just want to drive it with automation tools through its remote debugging APIs (again), remember to launch the target app’s process in an environment with this variable set:
$env:WEBVIEW2_ADDITIONAL_BROWSER_ARGUMENTS="--remote-debugging-port=8081 --remote-allow-origins=*"
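For example (just a minimal sketch, not a definitive recipe), assuming the variable above is already set in your current PowerShell session, that MyApp.exe stands in for whatever the actual target app is, and that port 8081 is free:
# Launch the target app from the same shell so its WebView2 process inherits the variable
Start-Process -FilePath ".\MyApp.exe"
# Once the WebView2 control is up, the Chromium remote debugging endpoint should answer here
Invoke-RestMethod -Uri "http://localhost:8081/json/version"
If that last call returns the browser and protocol versions, any CDP-based automation tool should be able to attach to http://localhost:8081 from there.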
And now [drums], some initial impressions about the elephant in the… world. As you probably know, Apple just announced a really interesting piece of technology at WWDC on June 5 — here’s their brief intro if you haven’t heard about it yet:
Despite the name drop — which, btw, is not the greatest IMO (and no, I’m not referring here to NameDrop, a really nice new way of sharing contact info between iPhones, revealed in the same keynote) — all of the Vision Pro’s features are thoroughly and very well thought out, overall.
One example that really impressed me is EyeSight; although some say it’s creepy, I would argue that the device would have been even creepier without that technology, considering the importance of eye contact in real-life human interactions.
There are, however, two potentially big problems (also IMHO) with the new headset and with Apple gaining enough market share once it arrives:
- Will early adopters reach the required threshold in a timely manner, especially with regard to wearing a headset of this type in public (or even at home when their spouse and kids are around), given its looks (generally good, I must admit…