Meta Platforms Inc., the owner of Facebook and Instagram, said on Tuesday, March 14 that it plans to cut around 10,000 more jobs.
The tech giant let go of 11,000 employees almost four months ago in the first major layoff of its 18-year history, and is now the first major IT giant to announce a second round of layoffs. Mark Zuckerberg, Chief Executive Officer, said in a message: “We expect to reduce our team size by around 10,000 people and to close around 5,000 additional open roles that we haven’t hired yet.”
This is seen as part of a wider restructuring. The move underscores Meta’s push to turn 2023 into the “Year of Efficiency,” with a promised $5 billion cut to projected expenses of between $89 billion and $95 billion. The news sent Meta’s shares up by 5-6%.
Meta is also looking to flatten its organizational structure: it may remove multiple layers of management and even ask some managers to become individual contributors, while eliminating non-engineering roles, automating more functions, and at least partially reversing the commitment to “remote-first” work made during the COVID-19 pandemic lockdowns.
Meta is investing billions in building the futuristic metaverse but has struggled; the reasons cited include high inflation, rising interest rates, geopolitical instability, increased regulation, and a decline in advertising revenue in the post-pandemic era.
Adobe Experience Manager, or AEM (formerly CQ5), is one of the best enterprise-grade web content management systems. At a high level, what makes it so strong is that it combines building websites, mobile apps, and forms with online communities, and it provides native digital marketing features as part of Adobe Marketing Cloud (a set of solutions that integrate seamlessly, allowing users to deliver personalized, cross-channel, real-time marketing campaigns).
AEM is implemented as a Java web application. It comes pre-configured with its own built-in servlet engine, but it can also be installed in any compatible third-party application server that supports the Java Servlet API 3.1 or higher. It provides out-of-the-box functionality such as a component library and a set of application services; however, the majority of projects require custom development, resulting in custom project code that extends the out-of-the-box functionality and adapts it to the project’s needs.
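To make “custom project code” concrete, here is a minimal sketch of one common form it takes: a custom servlet registered with Apache Sling (covered below) as an OSGi component. The package, resource type, and selector names are hypothetical placeholders for project-specific values, not AEM built-ins.

```java
// Hypothetical custom servlet: renders a small JSON summary for any resource
// of the (made-up) resource type "myproject/components/page".
package com.example.myproject.servlets;

import java.io.IOException;

import javax.servlet.Servlet;

import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;
import org.apache.sling.api.servlets.SlingSafeMethodsServlet;
import org.osgi.service.component.annotations.Component;

@Component(service = Servlet.class, property = {
        "sling.servlet.resourceTypes=myproject/components/page",
        "sling.servlet.selectors=summary",
        "sling.servlet.extensions=json",
        "sling.servlet.methods=GET"
})
public class PageSummaryServlet extends SlingSafeMethodsServlet {

    @Override
    protected void doGet(SlingHttpServletRequest request,
                         SlingHttpServletResponse response) throws IOException {
        // The servlet is resolved by resource type rather than by a fixed path,
        // so it automatically applies to every matching page.
        response.setContentType("application/json");
        response.getWriter().write(
                "{\"path\":\"" + request.getResource().getPath() + "\"}");
    }
}
```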
The architecture of AEM consists of three major layers:
Apache Sling Framework
Sling, an open-source Apache project, is a RESTful web application framework that provides access to the JCR repository via a RESTful API. It is used extensively for AEM authoring, as all author actions (dialogs) are performed via REST calls to the JCR through Sling. Sling itself runs within OSGi as one of its bundles.
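As a small illustration of that RESTful access, every JCR node is addressable as a URL, and Sling’s default GET servlet can render a node’s properties as JSON when you request it with a .json extension. The sketch below uses the plain Java 11 HttpClient with no AEM dependencies; the host, port, credentials, and content path are assumptions for a local development author instance.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class SlingJsonExample {
    public static void main(String[] args) throws Exception {
        // Default development credentials; never use these in production.
        String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes());

        // "/content/mysite/en" is a hypothetical page node; ".json" asks Sling's
        // default GET servlet for a JSON rendering of that node's properties.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:4502/content/mysite/en.json"))
                .header("Authorization", "Basic " + auth)
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```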
OSGi Application Runtime
OSGi (Open Services Gateway initiative) is a Java-based framework for building modular application stacks; in AEM, the OSGi runtime is Apache Felix. OSGi is a set of specifications describing a modular system and a service platform for the Java programming language, implementing a complete and dynamic component model (something that does not exist in a standalone JVM environment). An application in an OSGi-based system is composed of an assembly of components, called bundles, which can be dynamically installed, started, stopped, or uninstalled at runtime, without shutting down and restarting the entire application.
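Here is a minimal sketch of that dynamic bundle lifecycle, using the standard org.osgi.framework API: the framework invokes start() and stop() as the bundle is started or stopped at runtime. (AEM project code more commonly uses Declarative Services annotations, but the activator shows the underlying model.)

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class ExampleActivator implements BundleActivator {

    @Override
    public void start(BundleContext context) {
        // Called when the bundle is started: acquire services, register
        // listeners, etc. Other bundles keep running throughout.
        System.out.println("Started: " + context.getBundle().getSymbolicName());
    }

    @Override
    public void stop(BundleContext context) {
        // Called when the bundle is stopped or uninstalled: release resources.
        // The rest of the application is unaffected.
        System.out.println("Stopped: " + context.getBundle().getSymbolicName());
    }
}
```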
Java Content Repository (JCR)
The Java Content Repository (JCR) is an object database that supports both structured and unstructured content. It is a Java API, built around nodes and properties, for accessing a content repository in a uniform manner. Adobe’s implementation of JCR is CRX (Content Repository Extreme).
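A short sketch of the node-and-property model via the standard javax.jcr API follows. How you obtain the Session is environment-specific (in AEM code it is usually adapted from a ResourceResolver), and the paths and property values here are hypothetical.

```java
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;

public class JcrExample {

    public void readAndWrite(Session session) throws RepositoryException {
        // Content is addressed as a tree of nodes...
        Node content = session.getNode("/content");

        // ...and data lives in typed properties on those nodes.
        Node page = content.addNode("demo-page", "nt:unstructured");
        page.setProperty("jcr:title", "Hello JCR");
        session.save();

        // Read the property back in the same uniform way.
        System.out.println(page.getProperty("jcr:title").getString());
    }
}
```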
All of these layers run within a servlet container or a Java EE application server.
Sandworm, the infamous hacking unit within Russia’s GRU military intelligence agency, is suspected of having carried out some of the worst cyberattacks in history over many years.
Now Wired has published a profile of Colonel Evgenii Serebriakov, the officer now running the GRU’s Sandworm unit.
According to intelligence sources, Serebriakov was put in charge of Sandworm in the spring of 2022 after serving as deputy commander of APT28, and currently holds the rank of colonel.
Serebriakov was prosecuted along with six other GRU agents after being caught in the middle of a close-access cyber espionage operation in the Netherlands in 2018 that targeted the Organization for the Prohibition of Chemical Weapons in The Hague. In that foiled operation, Dutch law enforcement didn’t just identify and arrest Serebriakov and his team, who were part of a different GRU unit generally known as Fancy Bear or APT28; they also seized Serebriakov’s backpack full of technical equipment, as well as his laptop and other hacking devices in his team’s rental car. As a result, Dutch and US investigators were able to piece together Serebriakov’s travels and past operations stretching back years. The fact that Serebriakov appears to have reached his current position despite being identified and prosecuted over the failed Dutch operation suggests he must be of considerable value to the GRU. “He can’t just be a regular hacker anymore,” said Christo Grozev, the principal Russia-focused investigator for the open-source intelligence outlet Bellingcat; to Grozev, Serebriakov evidently seems “too good to dump.”
Serebriakov’s new position as leader of Sandworm (formally GRU Unit 74455, also known by the nicknames Voodoo Bear and Iridium) puts him in charge of a group of hackers who are probably among the world’s most aggressive practitioners of cyberwarfare. (They have also been involved in espionage and disinformation campaigns.)
Shares of First Republic Bank fell sharply in early trading this morning, triggering volatility halts on the stock and signaling investor discomfort with the financial institution despite government action over the weekend to resolve the Silicon Valley Bank crisis and contain potential cascading effects. The volatility comes just days after a stock market selloff that presaged SVB’s failure, as concern about contagion remains among analysts and the tech community more broadly.
As of the time of writing, equity shares of First Republic are off more than 65%, and trading has been halted as mentioned.
In an attempt to get ahead of investor concern, First Republic announced over the weekend that it had bolstered its financial position with “additional liquidity” from the Federal Reserve and JPMorgan Chase. Per the company’s statement on March 12th, it had “more than $70 billion” in unutilized liquidity “to fund operations.” Presumably that is the capital standing against the company’s selloff and a potential loss of investor confidence.
The question ahead of every startup and small business that lost faith in the stability of financial institutions over the past week is straightforward: where’s a safe place to park my money? Since the death of SVB, which claimed in 2022 that it banked half of all US venture-backed startups, First Republic Bank is one of those options. On one end, the stock drop might be seen as a concerning signal; on the other, other regional banks, including Western Alliance and PacWest, also appear to be taking a trading hit as uncertainty clouds business decisions for the foreseeable future.
The move comes despite the FDIC, the Department of the Treasury, and the Federal Reserve announcing on Sunday that depositors at Silicon Valley Bank would be made whole. Their actions precluded a potential crisis in which thousands of businesses would have been unable to make payroll or operate as usual. However, despite the Federal Reserve also announcing a plan to “make available additional funding to eligible depository institutions to help assure banks have the ability to meet the needs of all their depositors” through a “new Bank Term Funding Program [that will offer] loans of up to one year in length,” it appears that many public-market investors still want out of smaller banks.
At this juncture it’s worth considering what the morning markets would look like sans quick work by the government to staunch the bleeding.
Samsung and Google take very different approaches to smartphone photography. While Samsung has a penchant for experimenting with hardware and software camera features, throwing in everything but the kitchen sink, Google’s approach is far more deliberate, leaning heavily into computational photography to extract the best out of admittedly dated hardware.
These differing strategies lead to some interesting results. Samsung remains one of the pack leaders in smartphone photography simply because it isn’t afraid to push the limits on its top-tier flagships like the Galaxy S23 Ultra. All of that added hardware gets garnished with dollops of software features, which keeps consumers and reviewers alike entertained for one more release cycle. There’s a lot of utility on offer, no doubt, but it’s no secret that One UI offers more features than most people can remember, let alone regularly use in the two to four years they own a phone.
Google, on the other hand, is very cautious and intentionally slow with what it adds to the Pixel photography experience. There’s a certain level of Apple-esque polish and calculated lethargy to what arrives on Google Camera. In the borrowed words of a competing OEM, it’s a burdenless experience. It’s difficult to take a bad photo on a Pixel, and if you do, Google offers a bouquet of Pixel-only software features to fix it. As a result, you don’t feel like you are constantly navigating menus to find that one thing that your phone camera could do; or worse, just sticking to the standard photo features and leaving everything else you paid for grossly underutilized.
If we may be allowed to indulge in some fantasy, there’s a middle ground here that looks rather enticing.
What if Google opened itself up to the idea of greater experimentation when it comes to the camera? What if you could take a lot of what Samsung is doing, give it that Google polish and thoughtfulness, and seat it on top of the seamless Google Camera experience on the Pixel? We’re fantasizing, but here are four awesome Samsung camera features that we would love to see on the upcoming Google Pixel 8.
Pro photo and video modes
This is perhaps one of the biggest cons of the Google Pixel camera experience: there’s simply no manual mode on the phone, for either photos or video. Beyond a few basic controls for dual exposure (one for brightness/exposure, one for shadows/tone mapping) and color temperature, you cannot adjust any other parameter. If you’re planning to take a video, you lose the tone mapping setting too.
Google treats you like a kid under supervision: Play with these toys, and leave it to our algorithms to decide what’s best.
Google essentially treats you like a kid under supervision: play with the toys in front of you, and leave it to our algorithms to decide what’s best for you. The Pixel camera does let you output a RAW file in addition to the usual JPEG, but that still doesn’t give you control over the photo or video while you’re shooting it.
Samsung, on the other hand, provides extensive control over the hardware that you paid so much money for. It trusts that those who use the manual mode know what they want out of a photo or a video. In fact, Samsung trusts you so much, it offers a dedicated camera app called Expert RAW which goes a step beyond the manual mode within the main camera app.
Within the main camera app on the Galaxy S23 series, you can adjust ISO, shutter speed, focus points, and color temperature for photos. For videos, you can adjust focus and shutter speed, letting you pull off tricks like rack focus.
If you go down the Expert RAW rabbit hole, you can do all of this with even further granularity, and output 16-bit RAW images that have a wider dynamic range and other benefits. There’s a histogram on display too.
Samsung’s manual mode provides an infinite ceiling for your creativity and growth. You can take the best photos your skill allows you to, and you can upskill yourself without needing to buy a dedicated camera. It tries to give you the best of both worlds: a guided photo experience for the average user, and an unlimited experience for the enthusiast. You can still stick entirely to the algorithms if you don’t have the time and patience to painstakingly craft each setting for the perfect shot. But if you have the vision for a shot, you can absolutely go for it.
Samsung’s manual camera mode approach suffers from the typical One UI feature overload, though. Why is there a Pro mode for photos, a Pro mode for video, and then an entirely separate Expert RAW app? There’s room for streamlining here. Maybe unbundle the Pro modes from the camera app and let them live solely in the Expert RAW app? Perhaps provide all the granularity needed within the Pro modes themselves, removing the need for a separate app? There are different ways this could be improved, and this is where we feel Google could shine.
Manual mode provides an infinite ceiling for your creativity and growth.
We would love to see Google execute a streamlined manual mode that does everything an enthusiast would want without being daunting or overbearing. We’ve been asking for a manual mode on the Pixel for a few years now, and it’s about damn time Google took it seriously on the Pixel 8.
Single Take
We just spent a lot of time indulging our desire to painstakingly craft each shot on the Pixel. Now let’s jump to the opposite end of the spectrum: taking many shots effortlessly.
Samsung Galaxy phones come with a camera feature called Single Take. In a nutshell, Single Take aims to simplify photography even further. It’s a very One UI-esque solution to the feature overload problem on Samsung phones. Got too many modes and creative ways to click a photo or take a video? Why not shoot in all of them with the single click of a button? That’s Single Take.
Just frame your shot, click the shutter button, and watch as your phone takes up to 10 seconds to capture everything from a still photo to a boomerang video and plenty in between. You can get up to 10 different kinds of photos and four different kinds of videos with a single shutter click. It takes patience, as a Single Take shot can take anywhere between three and 10 seconds, but the end result is unmatched versatility.
Where Single Take falters is in overdoing the versatility and settling for mediocrity. While Samsung touts AI prowess in selecting the best moments and shots, the end product is a diverse set of captures that don’t actually wow you in any way.
In my personal experience with Single Take, I’ve found myself gravitating to the basic photo, as the results from the other modes didn’t feel tuned to the occasion. If I want a specific result, such as a sped-up timelapse-style video, I get better results shooting in that specific mode, since I’m more likely to pay special attention to the angles and the lighting. Single Take is not a magic wand, after all, and it can only work with what your camera can see.
If there is one company that can make Single Take work like a magic wand, it’s Google.
We’d love to see what Google’s take on Single Take would be, putting all those computational photography skills to good use. For instance, Single Take as a feature could become the default shooting mode. So when the average user clicks a photo of their pet, the Pixel camera could perhaps additionally suggest a boomerang and a slow-mo video that they would like as well.
Single Take is not a magic wand, but maybe Google could make it one with computational photography?
Google could also merge the concept into Google Photos, decluttering the output field: No need to show 14 different outputs, just show a single memory that expands when selected to display the other captures. This is similar to how Google Photos already handles Portrait mode and Motion photo — all outputs are saved but are not surfaced unless you look for them. Combine all of this with the other AI-based auto-editing that Google does, and maybe we’re onto a Google One premium feature in the making here.
“Sky guide” constellation overlay for astrophotography
There’s a nifty camera feature hidden within the Samsung Expert RAW app. You can use the app to pinpoint nearby stars and celestial bodies. Just open Expert RAW and tap on the constellation icon in the upper right corner to enable Sky guide. The app then overlays the constellation onto your viewfinder. Clicking a photo will take a very long exposure shot, as is usually done for astrophotography.
While astrophotography is admittedly a niche use case, what is rather strange about Samsung’s approach is tucking this feature into the Expert RAW app instead of the stock camera app. As a result, most users will never be aware of it. You could use the app to learn about the star group you are looking at, but because the feature is so tucked away, you’d never discover it. Or even if you did, you’d never remember to use it.
Google could look at including something similar in the Pixel camera experience. In fact, it could consider actively prompting users to take a look at celestial bodies through their camera viewfinders, by leveraging the power of Google Search and Assistant.
Rare planetary parade alongside some constellations? Send a Google Assistant notification to open Astrophotography mode!
Whenever a significant celestial event is taking place, Google could deliver a notification at the right time to the user to go out and witness the spectacle. There are definitely ways in which Google could integrate this feature and execute it better than Samsung.
Director’s View
Director’s View is a bit of a niche tool, but one that comes in very handy for anyone serious about vlogging their day.
With Director’s View on Samsung Galaxy phones, you can preview the output of the different camera lenses in the viewfinder and easily transition between them during a video recording. You can also choose to enable the front-facing camera in this mode to simultaneously see footage from all sensors. The only catch is that all the rear cameras aren’t simultaneously recording — the preview is just a cropped feed from the wide sensor — but your actual recording comes through the respective lens.
Samsung’s execution of Director’s View is actually pretty good, and we can’t find any immediate faults with it. That said, we’d still love to see what Google could do if it decides to implement something similar on the Pixels. Doing so would win favor with social media vloggers, and it might just help Google pick up some market share.
Bonus: Some Flex mode magic for the upcoming Pixel Fold
This isn’t on our wishlist for the Pixel 8 per se, but it’s definitely something we hope Google pays attention to.
The Samsung Camera app boasts Flex mode capabilities, letting the Galaxy Z Fold 4 and Galaxy Z Flip 4 get some creative uses out of the camera.
There are no such features in the Google Camera app. But to be fair to Google, the company’s Pixel Fold hasn’t been released yet. We just hope Google builds enough camera features for its own foldable to take advantage of. Otherwise, it will continue to play catch-up to Samsung for at least another release cycle.
Google already pushes the limits on the Pixel camera hardware, but there’s still room for more. Are there any other camera features you’d like to see on the Pixel 8? Let us know in the comments below!
Like it or not, it’s that time of year again — time for daylight saving time. On Sunday, March 12, at 2:00 a.m. (local time), the majority of people in the United States will be “springing forward” and setting their clocks ahead by an hour. Along with losing an hour of sleep, it also poses an important question: will your phone automatically change for daylight saving time?
It’s a question that applies regardless of which phone you have. Whether you’re rocking an iPhone 14 Pro, a Samsung Galaxy S23 Ultra, or any other smartphone, it’s important to know whether or not you need to change it for the new time.
In the past, when we didn’t have smart devices to do the work for us, we had to remember to change the clocks ourselves — or find out the hard way. Of course, with non-internet-connected wall clocks, oven clocks, and some car clocks, you’ll still have to do that work.
Most smartphone clocks will automatically adjust if your software is up to date. If you previously monkeyed with the settings and changed the date or time defaults, you may have to update your clock yourself to ensure it’s ready for daylight saving time.
On a Samsung phone, like the Samsung Galaxy S23, you’ll need to follow a slightly different path. Go to Settings, tap General management, and then Date and time, and make sure Automatic date and time is turned on.
Other daylight-saving phone tips
Although the above steps should be all you need to worry about, there are some other things you can do to be extra sure your phone changes when it should for daylight saving time.
If you haven’t updated your phone in a while, it’s worth checking to make sure it’s running the latest available software. On an iPhone, open the Settings app, tap General, and then Software Update. If you have an Android phone, go to the Settings app, scroll down the page, and tap on Software update or System update (the wording will be slightly different depending on which Android phone you have).
Outdated software shouldn’t impact your phone automatically changing for daylight saving time, but it certainly doesn’t hurt to install an update if one is available.
If you have any alarms set on your phone, they’ll also automatically update to the new daylight saving time without any extra work required from you.