The FTC wants to ban the noncompete clauses ensnaring some tech workers

The regulator says they harm employees and innovation for little advantage.

The Federal Trade Commission is trying to ban noncompete clauses in employment contracts, which companies put in place to prevent employees from going to work for competitors or leaving to start similar businesses. On Thursday, the regulator issued a notice of proposed rulemaking, saying that the clauses harm innovation and "lower competition for workers," which leads to lower wages overall.

While the proposed rule, which you can read here, would benefit workers across industries, it's particularly relevant in tech.

For years, we've seen big tech companies try to get employees at every level to sign noncompete contracts — Kickstarter's union fought against them while it was bargaining with the company, Acer sued a former CEO claiming he had violated one, Microsoft has imposed them on retiring executives, and Hideo Kojima was reportedly under one when leaving Konami. Amazon has even tried to restrict temporary warehouse workers' job opportunities for up to 18 months after they left (and made reaffirming the noncompete clause a condition for permanent employees who wanted severance when leaving the company).

Amazon actually gets a mention in a fact sheet the FTC put out alongside the proposal, with one of its employees used as an example. "Gene, a vice president at Amazon who had signed a non-compete, left the company to serve as head of product for a tech startup," the regulator says. "Amazon sued to block him from taking the job. After negative media coverage, Amazon dropped the suit."

The rule the FTC is proposing would require employers to drop any noncompete clauses in their existing contracts, as well as prevent them from adding new ones in the future. It would also apply to people who are classified as independent contractors. That's more good news for the tech industry, which often relies on people who aren't traditional employees. It also tries to soothe companies' fears by saying they'd still have ways to make sure employees didn't bolt to competitors with trade secrets — as we've seen in several high-profile cases from Tesla and Apple, there are other laws around intellectual property that employers could use in that kind of situation, regardless of noncompete contracts.

While this rule would almost certainly be beneficial to workers (including those outside the tech industry), there are several hurdles before it actually goes into effect. For one, the FTC is seeking public comment and specifically asking people to weigh in on whether franchisees, senior executives, and low- or high-wage workers should be treated differently under the rule.

It seems likely that companies will weigh in to try to protect their interests, and that last clause could be especially important for companies like Amazon, which has been scrutinized for applying noncompete clauses to both executives and warehouse workers.

Then the agency would be tasked with actually passing whatever regulation comes out of the public comment process. Democrats currently have the majority in the commission, which may help this rule's case given that it has support from chairperson Lina Khan.

Correction, January 5th, 5:53PM ET: An earlier version of this story stated that the FTC was deadlocked between Republicans and Democrats. While that was the case in the past, it now has a Democratic majority.

The Keychron Q10 is a great mainstream Alice keyboard

Keychron keeps doing it. Since we reviewed the Keychron Q2 in January 2022, it's overhauled the Q1 and released 12 other Q-series boards, from a regular old full-size all the way down to an ultracompact. There's even an HHKB. But perhaps the most unusual is the Q10: a 75 percent Alice-layout mechanical keyboard with a milled aluminum chassis. Like other Keychron Q-series keyboards, it's a fantastic keyboard for the price, with a bunch of enthusiast features at mid-range gaming keyboard prices. Like them, it's for a certain kind of person: someone who sees a $200 keyboard and says, "How is this so cheap?!"

Imagine that someone split a keyboard down the middle, rotated each half slightly, kinked the outside columns back the other way a bit, and glued it back together. That's Alice — named for the TGR Alice, a 60 percent keyboard from Malaysian designer Yutski that ran as a 40-unit group buy back in 2018 and inspired a legion of clones, imitators, variants, and spinoffs.

Like other Alice boards, the Q10 isn't quite a split keyboard, and it's not quite an ergonomic keyboard. You can't adjust the angle or the tenting, nor position the halves independently. They aren't far enough apart to really keep your forearms parallel to each other, shoulder-width apart. And the Q10, in particular, is a bit tall. But it's a bit more comfortable than a standard keyboard because it lets you keep your wrists at a more neutral angle to your forearms. I feel like it opens up my shoulders a bit more. It also looks cool.

THE GOOD
Interesting and useful layout
Great feel and sound
Easy key remapping
South-facing hot-swap PCB
Left-side volume knob

THE BAD
Cheap-looking keycaps
You have to want a five-pound keyboard
$200 is either too expensive or suspiciously cheap

Fullmetal Alice

For $215 with keycaps and switches or $195 without, the Q10 is, believe it or not, an absolute steal. The Q series is Keychron's attempt to make an off-the-shelf mechanical keyboard feel like a high-end custom, and it mostly works — if your vision of a high-end keyboard includes terms like "gasket mount" and "milled aluminum chassis."

My review unit weighs 2,244 grams, or just under five pounds, with the stock keycaps and switches. It's meant to go on a desk and stay there. Keychron is following the keyboard community here: most custom keyboards over the last decade have been made from milled aluminum, for a few reasons. Aesthetically, metal keyboards look nice, heavy things feel high-end, and they don't slide around your desk while you type. And practically, the per-unit cost of CNC-milled aluminum scales linearly, which matters if you're only making 50 or 100 of something for people who don't mind paying hundreds of dollars each. It's only in the past few years that enthusiast keyboard makers have gotten the scale necessary to make plastic cases, just as more established manufacturers started making milled-aluminum ones.

Like the other Q-series boards, it's gasket-mounted: the switch plate sits on strips of squishy foam between the top and bottom frames. This gives the whole assembly a pleasant bounce: if you push hard enough on any key, you can see all the keys move downward en masse and spring back up. Small silicone bumpers between the top and bottom frames prevent metal-on-metal contact, further reducing vibration and eliminating the high-pitched ping that solid-aluminum cases often have. There's a layer of sound-damping foam between the switch plate and PCB. The switches are lightly lubed, and the stabilizers are… less lightly lubed.

These are all ways enthusiasts mod their keyboards to give them deeper, fuller sounds and reduce high-pitched clacking or pinging — to put it another way, to compensate for the fact that they're milled out of solid aluminum. Another is the tape mod (or Tempest mod, after the person who popularized it), which involves applying layers of tape to the back of the PCB to change the sound profile. It's cheap and easy, and it works; I've done it to several keyboards. The Q10 comes pre-tape-modded, with a thin sheet of "acoustic tape" in lieu of the layer of acoustic foam other Q-series boards have.

Does it work? Yeah.

With the stock keycaps and Gateron Pro Red switches, the Q10 feels and sounds great. And I don't even like light linear switches. It's not quiet, necessarily, but most of the sound comes from the keycaps clicking against the switch plate. There's no resonance or ping whatsoever. Even the space bars — usually the loudest keys on any keyboard — are fairly quiet, probably because they're the size of typical Shift keys. I personally don't type with enough force to feel any bounce from the gasket mount — it feels about the same as an integrated plate to me, to be honest — but it seems to help the sound profile, and it isn't hurting anything.

The stock screw-in PCB-mount stabilizers are okay. They're generously but inexpertly lubed, and the backspace key is louder than I'd like. If it were my keyboard, they're the first things I'd tweak. Still, by preinstalled stabilizer standards, they're pretty good.

Alice good

This is the first time I've used an Alice board, and it took almost no effort to get used to. It helps that the layout is mostly standard. Generally, the keys are the size you'd expect them to be and roughly where you'd expect them to be.

The bottom row is probably the trickiest adjustment: there are three 1.25u modifier keys to the left of the main space bar and a function key to the right of it. On the right-hand side, there's another space bar, then a solitary 1u modifier that, by default, acts as the board's function key. If you're used to relying on those right-hand modifiers, you'll have to get creative. Fortunately, that's all fixable: the Q10, like all of Keychron's Q-series boards, is fully programmable using VIA, a flexible and popular app in the keyboard community for customizing RGB lighting and key mapping.

The Q10 includes both Mac- and Windows-compatible keycaps in the box

Find the best AI-powered app to transcribe your audio

The popular Otter transcription service is changing its plans — but there are alternatives.

Whenever a popular online app announces a change to its prices, or to the services it offers for those prices, you're going to get a reaction from its subscribers — especially the long-term ones. One app that caused this kind of dismay was Otter, a recording and transcription service that, in August 2022, announced downgrades to the services offered on some of its plans and raised the price of its monthly plan.

When a service you've used for a while makes radical changes to its prices and feature set, a natural reaction is to start shopping around to see if there are any viable alternatives.

There are two kinds of transcription services available online today: those that use an AI engine and those that use human transcribers. The latter is usually much more accurate but is also considerably more expensive. As a result, a lot of people use AI-driven services to interpret and transcribe their audio, which is less expensive and usually reasonably, if not perfectly, accurate. So what we've put together is a list of AI-powered transcription services for you to consider.

One thing to be aware of: the quality of transcription offered by these apps can vary widely depending not only on the AI engine the app uses but also on the quality of your audio file. If there are a lot of voices speaking at once, if there is a lot of background noise, or if the speakers have accents unfamiliar to the AI, the accuracy of the transcription can degrade. So it's a good idea to try out a transcription service with a typical file to see how well it performs.

And consider which app will be most cost-effective for you. If you only need to upload an occasional file, it may be best to go with either a free tier or one of the pay-as-you-go services. If you do regular uploads, then a monthly or annual subscription may work better.

Let's start with the service that provided the impetus for this article.

Otter

Otter offers a fairly impressive range of services, including the ability to easily record Zoom and Google Meet meetings, automatically create an outline of your transcription or pull out highlighted phrases, and organize your transcriptions into folders and your contacts into groups.

As mentioned, there have been quite a few changes to the company's prices and features. For example, free users will no longer have access to all of their past transcriptions — only the last 25. Paying customers on Otter's Pro plan will be downgraded from a monthly allowance of 6,000 minutes of transcribed audio to 1,200 minutes and from a maximum of four hours of audio per conversation to 90 minutes. (You can find an FAQ about these changes here.)

Otter has tried to ameliorate the pain for its paying customers — somewhat. While it has raised its monthly price from $12.99 to $16.99, its annual price of $99.96 did not change. Otter's Business plan ($30 a month or $240 annually) still has the 6,000 minutes per month / four hours per conversation allowance, along with other features.

Temi

Temi is a basic transcription service that offers such features as the ability to review and edit your transcriptions, slow down the replay, and export your files into text (Microsoft Word, PDF) or closed caption (SRT, VTT) files. Its mobile apps for Android and iOS allow you to record audio; you can then choose to transcribe it for a straightforward 25 cents per audio minute or upload your own recordings for the same price. New users get the first 45 minutes free.

Rev Max

Rev has been around for a while; until recently, it was mainly available for those who wanted human transcription services. The company has now introduced Rev Max, an AI transcription service that offers 20 hours of automated transcription services and unlimited Zoom transcripts for $29.99. (If you pass the 20-hour mark, you’ll be charged 25 cents a minute until your next month begins.) You also get a 5 percent discount on any human-based transcription services and no time limit on storage for your transcriptions. There is a 14-day free trial period, but you have to put in a credit card to get it.
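
To make that cost comparison concrete, here is a rough back-of-the-envelope sketch using the prices quoted above (Temi at 25 cents per minute, Otter Pro at $16.99 a month for up to 1,200 minutes, and Rev Max at $29.99 for 20 hours plus 25 cents per overage minute). It ignores free tiers, trials, and per-conversation limits, so treat it as an illustration rather than a pricing guide.

```python
# Rough monthly-cost comparison for a given number of transcribed minutes,
# based on the prices quoted in this article. Illustration only; check each
# service's current pricing and plan limits before deciding.

def temi_cost(minutes: float) -> float:
    """Pay-as-you-go: 25 cents per audio minute."""
    return 0.25 * minutes

def otter_pro_cost(minutes: float) -> float:
    """Pro plan: $16.99/month, capped at 1,200 transcription minutes."""
    return 16.99 if minutes <= 1200 else float("inf")  # beyond the cap, Pro alone won't cover you

def rev_max_cost(minutes: float) -> float:
    """Rev Max: $29.99/month for 20 hours, then 25 cents per extra minute."""
    included = 20 * 60
    return 29.99 + max(0, minutes - included) * 0.25

for m in (60, 500, 1500):
    print(f"{m:>5} min  Temi ${temi_cost(m):7.2f}  "
          f"Otter Pro ${otter_pro_cost(m):7.2f}  Rev Max ${rev_max_cost(m):7.2f}")
```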

ChatGPT proves AI is finally mainstream — and things are only going to get weirder

Researchers talk about the "capability overhang," or hidden abilities and risks, of artificial intelligence. As the technology goes mainstream, we're going to discover a lot of new things about these systems.

A friend of mine texted me earlier this week to ask what I thought of ChatGPT. I wasn't surprised he was curious. He knows I write about AI and is the kind of guy who keeps up with whatever's trending online. We chatted a bit, and I asked him: "And what do you think of ChatGPT?" To which he replied: "Well, I wrote a half-decent Excel macro with it this morning that saved me a few hours at work" — and my jaw dropped.

For context: this is someone whose job involves a fair bit of futzing around with databases but whom I wouldn't describe as particularly tech-minded. He works in higher education, studied English at university, and never formally learned to code. But here he was, not only playing around with an experimental AI chatbot but using it to do his job faster after only a few days' access.

"I asked it some questions, asked it some more, put it into Excel, then did some debugging," is how he described the process. "It wasn't perfect, but it was easier than Googling."

Stories like this have been accumulating all week, like the first spots of rain before a downpour. Across social media, people have been sharing stories about using ChatGPT to write code, draft blog posts, compose university essays, compile work reports, and even improve their chat-up game (okay, that last one was definitely done as a joke, but the prospect of AI-augmented rizz is still tantalizing). As a reporter who covers this space, it's been essentially impossible to keep up with everything that's going on, but one overarching trend has stuck out: AI is going mainstream, and we're only just beginning to see the impact this will have on the world.

There's a concept in AI that I'm particularly fond of that I think helps explain what's happening. It's called "capability overhang" and refers to the hidden capacities of AI: skills and aptitudes latent within systems that researchers haven't even begun to investigate yet. You might have heard before that AI models are "black boxes" — that they're so big and complex that we don't fully understand how they operate or arrive at specific conclusions. This is broadly true and is what creates this overhang.

"Today's models are far more capable than we think, and the techniques we have available for exploring [them] are very juvenile," is how AI policy expert Jack Clark described the concept in a recent edition of his newsletter. "What about all the capabilities we don't know about because we haven't thought to test for them?"

Capability overhang is a technical term, but it also perfectly describes what's happening right now as AI enters the public domain. For years, researchers have been on a tear, pumping out new models faster than they can be commercialized. But in 2022, a glut of new apps and programs suddenly made these capabilities available to a general audience, and in 2023, as we continue to explore this new territory, things will start changing — fast.

The bottleneck has always been accessibility, as ChatGPT demonstrates. The bones of the program aren't entirely new (it's based on GPT-3.5, a large language model released by OpenAI this year, which is itself an upgrade to GPT-3, from 2020). OpenAI has previously sold access to GPT-3 as an API, but the company's ability to improve the model's capacity for natural dialogue and then put it on the web for anyone to play with brought it to a far bigger audience. And no matter how resourceful AI researchers are in probing a model's skills and weaknesses, they'll never be able to match the mass and chaotic intelligence of the internet at large. All of a sudden, the overhang is accessible.

The same dynamic can be seen in the rise of AI image generators. Again, these systems have been in development for years, but access was limited in various ways. This year, though, systems like Midjourney and Stable Diffusion allowed anyone to use the technology for free, and suddenly AI art is everywhere. Much of this is due to Stable Diffusion, which offers an open-source license for companies to build on. In fact, it's an open secret in the AI world that every time a company launches some new AI image feature, there's a decent chance it's just a repackaged version of Stable Diffusion. That includes everything from the viral "magic avatar" app Lensa to Canva's AI text-to-image tool to MyHeritage's "AI Time Machine." It's all the same tech underneath.

As the metaphor suggests, though, the prospect of a capability overhang isn't necessarily good news. As well as hidden and emerging capabilities, there are hidden and emerging threats. And these dangers, like our new abilities, are almost too numerous to name. How, for example, will schools adapt to the proliferation of AI-written essays? Will the creative industries be decimated by the spread of generative AI? Is machine learning going to create a tsunami of spam that ruins the web forever? And what about the inability of AI language models to distinguish fact from fiction, or the demonstrated biases of AI image generators that sexualize women and people of color? Some of these problems are known; others are ignored; and still more are only just beginning to be noticed. As the excitement of 2022 fizzles out, it's certain that 2023 will contain some rude awakenings.

Welcome to the AI overhang. Hold on tight.

Top AI conference bans use of ChatGPT and AI language tools to write academic papers

AI tools can be used to ‘edit’ and ‘polish’ authors’ work, say the conference organizers, but text ‘produced entirely’ by AI is not allowed. This raises the question: where do you draw the line between editing and writing?

One of the world’s most prestigious machine learning conferences has banned authors from using AI tools like ChatGPT to write scientific papers, triggering a debate about the role of AI-generated text in academia.

The International Conference on Machine Learning (ICML) announced the policy earlier this week, stating, “Papers that include text generated from a large-scale language model (LLM) such as ChatGPT are prohibited unless the produced text is presented as a part of the paper’s experimental analysis.” The news sparked widespread discussion on social media, with AI academics and researchers both defending and criticizing the policy. The conference’s organizers responded by publishing a longer statement explaining their thinking. (The ICML responded to requests from The Verge for comment by directing us to this same statement.)

According to the ICML, the rise of publicly accessible AI language models like ChatGPT — a general purpose AI chatbot that launched on the web last November — represents an “exciting” development that nevertheless comes with “unanticipated consequences [and] unanswered questions.” The ICML says these include questions about who owns the output of such systems (they are trained on public data, which is usually collected without consent and sometimes regurgitate this information verbatim) and whether text and images generated by AI should be “considered novel or mere derivatives of existing work.”

The latter question connects to a tricky debate about authorship — that is, who “writes” an AI-generated text: the machine or its human controller? This is particularly important given that the ICML is only banning text “produced entirely” by AI. The conference’s organizers say they are not prohibiting the use of tools like ChatGPT “for editing or polishing author-written text” and note that many authors already used “semi-automated editing tools” like grammar-correcting software Grammarly for this purpose.

“It is certain that these questions, and many more, will be answered over time, as these large-scale generative models are more widely adopted. However, we do not yet have any clear answers to any of these questions,” write the conference’s organizers.

As a result, the ICML says its ban on AI-generated text will be reevaluated next year.

The questions the ICML is addressing may not be easily resolved, though. The availability of AI tools like ChatGPT is causing confusion for many organizations, some of which have responded with their own bans. Last year, coding Q&A site Stack Overflow banned users from submitting responses created with ChatGPT, while New York City’s Department of Education blocked access to the tool for anyone on its network just this week.

In each case, there are different fears about the harmful effects of AI-generated text. One of the most common is that the output of these systems is simply unreliable. These AI tools are vast autocomplete systems, trained to predict which word follows the next in any given sentence. As such, they have no hard-coded database of “facts” to draw on — just the ability to write plausible-sounding statements. This means they have a tendency to present false information as truth since whether a given sentence sounds plausible does not guarantee its factuality.

In the case of ICML’s ban on AI-generated text, another potential challenge is distinguishing between writing that has only been “polished” or “edited” by AI and that which has been “produced entirely” by these tools. At what point do a number of small AI-guided corrections constitute a larger rewrite? What if a user asks an AI tool to summarize their paper in a snappy abstract? Does this count as freshly generated text (because the text is new) or mere polishing (because it’s a summary of words the author did write)?

Before the ICML clarified the remit of its policy, many researchers worried that a potential ban on AI-generated text could also be harmful to those who don’t speak or write English as their first language. Professor Yoav Goldberg of the Bar-Ilan University in Israel told The Verge that a blanket ban on the use of AI writing tools would be an act of gatekeeping against these communities.

“There is a clear unconscious bias when evaluating papers in peer review to prefer more fluent ones, and this works in favor of native speakers,” says Goldberg. “By using tools like ChatGPT to help phrase their ideas, it seems that many non-native speakers believe they can ‘level the playing field’ around these issues.” Such tools may be able to help researchers save time, said Goldberg, as well as better communicate with their peers.

But AI writing tools are also qualitatively different from simpler software like Grammarly. Deb Raji, an AI research fellow at the Mozilla Foundation, told The Verge that it made sense for the ICML to introduce policy specifically aimed at these systems. Like Goldberg, she said she’d heard from non-native English speakers that such tools can be “incredibly useful” for drafting papers, and added that language models have the potential to make more drastic changes to text.

“I see LLMs as quite distinct from something like auto-correct or Grammarly, which are corrective and educational tools,” said Raji. “Although it can be used for this purpose, LLMs are not explicitly designed to adjust the structure and language of text that is already written — it has other more problematic capabilities as well, such as the generation of novel text and spam.”

Goldberg said that while he thought it was certainly possible for academics to generate papers entirely using AI, “there is very little incentive for them to actually do it.”

“At the end of the day the authors sign on the paper, and have a reputation to hold,” he said. “Even if the fake paper somehow goes through peer review, any incorrect statement will be associated with the author, and ‘stick’ with them for their entire careers.”

This point is particularly important given that there is no completely reliable way to detect AI-generated text. Even the ICML notes that foolproof detection is “difficult” and that the conference will not be proactively enforcing its ban by running submissions through detector software. Instead, it will only investigate submissions that have been flagged by other academics as suspect.

In other words: in response to the rise of disruptive and novel technology, the organizers are relying on traditional social mechanisms to enforce academic norms. AI may be used to polish, edit, or write text, but it will still be up to humans to assess its worth.

Google’s new split-screen look for Android Auto is rolling out to everyone

Now Android Auto lets you easily see the map and your music at the same time, and even messages or other alerts as they pop up.

The Android Auto look and feel has evolved greatly since we reviewed it in 2015, but now its biggest update is starting to roll out to all users, introducing a split-screen UI that can let you see more things at once. Keeping the map on screen while also adding one or two other panes makes it a bit more like Apple’s current approach to CarPlay, and Google says its focus is on creating a “more personal, easy-to-use experience from behind the wheel.”

Wherever the inspiration comes from, I appreciate it. Dubbed "Coolwalk" during testing over the last year or so, the new UI has been available in beta for several months since it was publicly announced in the spring, and as a longtime Android Auto user, I've had it in my car.

While Android Auto previously prioritized either the media player or navigation, shoving the other option to a status bar at the bottom, now you can skip through a podcast and still keep the Google Maps screen up and even have it in the main window.

I was worried that the additional information would be distracting, but after trying it, I think that isn’t the case. There’s usually a dual-pane setup visible — you can still have just one app showing if you prefer — and it didn’t suffer in my car for the shared screen space. It also makes Google Assistant more useful in-car, as messages slide in as a third segment in the corner, occasionally with one-touch options to send canned replies to texts or, if you’re heading on a trip, a way to send your ETA to a contact easily.

The launcher bar is still a part of Android Auto, putting some of your recently used apps on the bottom or side of the screen where they can be quickly pulled up, which is good if you don’t know whether this is a podcast or a music day.

One feature that’s still coming soon for recent Pixel and Samsung phones is the ability to make calls using WhatsApp from within Android Auto.

Google didn’t give details on the timing for the rollout but said CES 2023 attendees could check it out in a BMW i7 if they’re around the show floor this week.

Data Engineering Best Practices to Drive Transformation

To succeed in the age of digital transformation, enterprises are embracing data-driven decision-making, and making quality data available in a reliable manner proves to be a major determinant of success for data analytics initiatives. Data engineering teams juggle building infrastructure, running jobs, and fielding ad hoc requests from the analytics and BI teams. This is why data engineers are tasked with taking a broader set of dependencies and requirements into account as they design and build their data pipelines.

But is there a way to structure this work logically? The answer is both yes and no. To start, you'll need to understand the current state of affairs — the decentralization of the modern data stack, the fragmentation of the data team, the rise of the cloud — and how these factors have changed the role of data engineering forever. You'll also need a proven framework of data engineering best practices that ties the data pieces together and makes decision-making seamless.

In this article, based on our experience, we'll shed light on some data engineering best practices that make it easier to work with data while delivering innovative solutions faster.

Data Engineering Best Practices

The practices listed below will help you build clean, usable, and reliable data pipelines, accelerate the pace of development, improve code maintenance, and make working with data easier. This will ultimately enable you to prioritize actions and move your data analytics initiatives forward more quickly and efficiently.

  • Analysis of Source Data
  • ETL Tool Evaluation
  • Data Acquisition Strategy
  • Storage Capability – Centralized & Staging
  • Data Warehousing

Analysis of Source Data

Business data, whether qualitative or quantitative, can take different forms depending on how it is collected, created, and stored. You need the right tech stack, infrastructure, and processes in place to analyze it and generate accurate, reliable insights. Here's a quick rundown of how to go about it:

  • Assess Data Needs & Business Goals: Gain a clear understanding of how you would approach big data analytics at the very outset. The type of data you will collect, where it will be stored, how it will be stored, and who will analyze it – everything needs to be planned.
  • Collect & Centralize Data: Once you have a clear understanding of your data needs, extract all structured, semi-structured, and unstructured data from your vital business applications and systems, then transfer it to a data lake or a data warehouse. This is where the ELT or ETL process comes into play (see the sketch after this list).
  • Perform Data Modeling: For analysis, data needs to be centralized in a unified data store. But before transferring your business information to the warehouse, you may want to define a data model. This will help you determine how the information is related and how it flows together.
  • Interpret Insights: Use different analytical methods to uncover practical insights from business information. You can analyze historical data, track key processes in real time, monitor business performance, and predict future outcomes.
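
As a minimal illustration of the collect-and-centralize step above, here is a sketch of a simple extract-transform-load job in Python. It assumes CSV exports from source systems and uses SQLite as a stand-in for the warehouse; the file names, table name, and transformations are hypothetical.

```python
# Minimal ETL sketch: extract CSV exports from business systems, apply a light
# transform, and load them into a centralized staging table. SQLite stands in
# for the warehouse here; file and table names are illustrative only.
import sqlite3
import pandas as pd

SOURCES = ["crm_contacts.csv", "erp_orders.csv"]  # hypothetical source extracts

def extract(path: str) -> pd.DataFrame:
    """Read one source extract into a DataFrame."""
    return pd.read_csv(path)

def transform(df: pd.DataFrame, source: str) -> pd.DataFrame:
    """Normalize column names and tag each row with its source system."""
    df = df.rename(columns=lambda c: c.strip().lower().replace(" ", "_"))
    df["source_system"] = source
    return df

def load(df: pd.DataFrame, conn: sqlite3.Connection, table: str) -> None:
    """Append the frame to the staging table in the warehouse."""
    df.to_sql(table, conn, if_exists="append", index=False)

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    for src in SOURCES:
        load(transform(extract(src), src), conn, table="stg_raw_data")
    conn.close()
```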

ETL Tool Evaluation

ETL tools can efficiently move your data from the source to different target locations. They deliver the insights that your finance, customer service, sales, and marketing departments need to make smarter business decisions. But how do you choose the right tool? Listed below are some important criteria for evaluating an ETL tool against your business needs:

  • Pre-built Connectors and Integrations
  • Ease of Use
  • Pricing
  • Scalability and Performance
  • Customer Support
  • Security and Compliance
  • Whether you want to go for Batch Processing or Real-Time Processing
  • ETL or ELT

Data Acquisition Strategy

Data acquisition is the process of discovering data outside the organization and bringing it into your systems. The key consideration here is what valuable insights you need to glean from this information and how it will be used, which requires smart planning to ensure no time or resources are wasted on data that won't be of use. Here are a few points based on our experience:

  • One-click Ingestion: Moves all existing data to a target system. All analytics systems and downstream reporting tools rely on a steady stream of accessible data. One-click ingestion allows you to ingest data in different formats into an existing table in Azure Data Explorer and create mapping structures.
  • Incremental Ingestion: The incremental extract pattern allows you to extract only changed data from your source tables, views, or queries, reducing the load on your source systems and the overall ETL duration. To determine the incremental ingestion approach that meets your needs, consider the format, volume, velocity, and access criteria of your source data (a minimal sketch of the pattern follows below).
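
Below is a minimal sketch of the watermark-based incremental extract pattern described above, again using SQLite as a stand-in. The table and column names (orders, updated_at, etl_watermarks) are hypothetical; most warehouses and orchestration tools offer equivalent change-tracking mechanisms.

```python
# Watermark-based incremental extract: pull only rows changed since the last
# successful run. Table/column names are hypothetical; etl_watermarks is
# assumed to use table_name as its primary key.
import sqlite3

def get_watermark(conn: sqlite3.Connection, table: str) -> str:
    """Return the timestamp the last successful load reached for `table`."""
    row = conn.execute(
        "SELECT last_loaded_at FROM etl_watermarks WHERE table_name = ?", (table,)
    ).fetchone()
    return row[0] if row else "1970-01-01T00:00:00"

def extract_incremental(conn: sqlite3.Connection, table: str) -> list:
    """Fetch only rows modified after the stored watermark."""
    watermark = get_watermark(conn, table)
    return conn.execute(
        f"SELECT * FROM {table} WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()

def update_watermark(conn: sqlite3.Connection, table: str, loaded_through: str) -> None:
    """Record how far we loaded, so the next run starts from there."""
    conn.execute(
        "INSERT OR REPLACE INTO etl_watermarks (table_name, last_loaded_at) VALUES (?, ?)",
        (table, loaded_through),
    )
    conn.commit()
```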

If data ingestion has issues, every following stage suffers. Inaccurate data results in erroneous reports, spurious analytic results, and unreliable decisions.

Storage Capability – Centralized & Staging

While storage needs are specific to every enterprise, here are 6 key factors to consider when choosing the right data warehouse.

  • Cloud vs. On-prem: Staying on-prem makes sense if most of your mission-critical databases are on-premises and not compatible with cloud-based data warehouses. Otherwise, you won't want to take on the stresses and strains that accompany on-prem infrastructure.
  • Implementation Cost & Time: Vendors have radically different pricing models for computing power, storage, configurations, and so on, so do your due diligence on pricing. You also need to factor in the cost of the team handling the implementation. When weighing implementation time, make sure your chosen data warehouse does not take months to implement. Cost is a decisive factor, but time is more crucial: a moderately priced data warehouse can prove insanely expensive if you have to wait longer for the insights you need to outwit your competitors.
  • Tech Stack: If your business has invested heavily in a specific data tech stack and does not have a major chunk of information residing outside of it, then picking that ecosystem’s tech stack makes sense. For instance, if most of your solutions have an SQL Server backend and need a custom integration, odds are you’ll go with Azure!
  • Scalability: If you’re a fast-growing enterprise, you need to determine the current volume of data, how likely it is to grow, and if the data warehouse can expand with your growing business needs.
  • Ongoing Costs and Maintenance: Your ongoing costs can far outweigh the resources you allocate upfront. The costs to consider include staff time spent on performance tuning, storage and compute resources, and data warehouse maintenance.
  • IT Support: Make sure to verify that your preferred tool comes with an online community and live support that is included in the pricing tier. Having instant access to IT support for prompt handling of IT issues is a real lifesaver.

The future of data lies in the cloud, and we work with both Azure & AWS. Lay a solid foundation of sound data infrastructure that allows you to extract the right insights to deliver growth & transformation for your business.

Data Warehousing

With data warehousing (DWH) as a service, you can build a common data model irrespective of the data sources and improve visibility into them for informed decision-making. Plus, you get the added advantage of a cloud service that can scale up and down as your business needs change.

  • Categorize Business Processes Using a Bus Matrix: A bus matrix is both a project artifact and a design tool that simplifies the representation of the subject areas and dimensions associated with your DWH. It acts as a guide to the design phase and provides a mechanism for communicating business requirements back into the overall architecture. It serves many purposes, from communicating requirements, capabilities, and expectations with business users down to the prioritization of tasks.
  • Define the Granularity: The grain defines the lowest level of detail for any table in the DWH. If a table contains daily marketing data, it has daily granularity; if it contains sales data for each month, it has monthly granularity. At this stage, you need to answer questions like:
    • Do you need to store sales information on an hourly, daily, weekly, or monthly basis? This decision is based on your reporting needs.
    • Do you need to store your entire product portfolio or just a few categories? This decision is based on your key business processes.
  • Identify Dimensions and Attributes: Dimensions typically include specific details like products, dates, inventory, and store location. This is where all the data gets stored for a given duration, which may range from a week to a month to a year. Attributes are the different characteristics of a dimension in data modeling; in a store location dimension, the attributes might be state, zip code, and country. They are typically used for searching and classifying facts.
  • Identify Facts: This step is closely associated with business users, as they access the stored data in the warehouse through the fact table rows. Facts are numerical values like cost per unit and price per unit; they help determine, for example, daily sales across product categories and locations.
  • Star Schema: An arrangement of tables that enables accurate analysis of business performance. The star schema architecture resembles a star, with points radiating from a central table: the center holds the fact table, and the points are dimension tables (see the sketch after this list).
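
To ground the star schema description, here is a minimal sketch of what the retail example above might look like as table definitions. SQLite again stands in for the warehouse engine, and all table and column names are illustrative.

```python
# A minimal star schema sketch: one fact table (daily sales) surrounded by
# dimension tables, matching the grain/dimension/fact steps described above.
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT, year INTEGER);
CREATE TABLE IF NOT EXISTS dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE IF NOT EXISTS dim_store   (store_key INTEGER PRIMARY KEY, state TEXT, zip_code TEXT, country TEXT);

-- Grain: one row per product, per store, per day.
CREATE TABLE IF NOT EXISTS fact_sales (
    date_key       INTEGER REFERENCES dim_date(date_key),
    product_key    INTEGER REFERENCES dim_product(product_key),
    store_key      INTEGER REFERENCES dim_store(store_key),
    units_sold     INTEGER,
    price_per_unit REAL,
    cost_per_unit  REAL
);
"""

conn = sqlite3.connect("warehouse.db")
conn.executescript(DDL)  # create the schema if it does not already exist
conn.close()
```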

Rishabh’s Data Engineering Mix

As part of our data engineering services, we help organizations advance to the next level of data usage by providing data discovery & maturity assessments, data quality checks & standardization, cloud-based solutions for large volumes of information, batch data processing (with database optimization), data warehouse platforms, and more. We help develop data architecture by integrating new & existing data sources to create more effective data lakes. We can also integrate ETL pipelines, data warehouses, BI tools & governance processes.

Final Words

With data engineering as a service, every business can accelerate value creation from the data it collects, extract intelligence to improve strategies, and optimize analytics to drive real-time decisions. The best practices listed here will make your data pipelines consistent, robust, scalable, reliable, reusable & production-ready. With that in place, data consumers like data scientists can focus on science instead of worrying about data management.

Since this field doesn't have the wide range of well-established best practices that software engineering does, you can work with a data engineering partner and benefit from their experience. They can help you achieve these goals by leveraging the right tech stack, on-premises architecture, or cloud platforms & integrating ETL pipelines, data warehouses, BI tools & governance processes. The result is accurate, complete & error-free data that lays a solid groundwork for swift & seamless adoption of AI & analytics.

7 Women Web Designers Ready to Set the Web Design Industry on Fire

It’s the 21st century and women are storming the workforce more than ever before, yet there are very few women making a name in the predominately male web design industry.

Consider the results of the 2007 A List Apart web design survey, which found that out of nearly 33,000 professionals who participated, 82.8% were male while only 16.1% were female (1.1% gave no answer).

Despite the stats, what captured my interest was that in a largely male-dominated industry, there are seven women who stand out and amaze with their web designs. These successful designers have battled the odds and come out on top in the design industry! The women listed here are from all over the world and are setting the web design industry on fire… right where it counts the most!

Why so few women in the web design industry?

The women in the web design industry are fighting the status quo and the statistics. They often cannot support each other or find inspiration from other female designers because there are only a handful of women who can do both artwork and coding. Consider that while 60% of graphic designers may be women, only 3% are computer programmers.

So seriously, kudos to these Fab Seven who have made a name for themselves and paved the way for more women to become web designers.

1. Saifaa Shabir, India  (Up and Coming!)

Touted as the youngest female entrepreneur in Kashmir, India, to revolutionize web design, Saifaa runs a company under the name Sysarche-e. She aims to create an effective web system and to standardize the procedures, processes, and methods needed to produce a consistent, measurable web system in Kashmir. She offers graphic design services including logos, brochures, brand and corporate identity, packaging design, identity guidelines, and communication design, including animation and 3D graphics.

2. Zsofi Koller, Canada 

Owner and creative director of Tangerine Sky Designs, located in Lunenburg, Canada, Zsofi provides women entrepreneurs with clean, open web designs and intelligent company branding that identifies the business for their customers. Her company is all about confidently expressing your brand by helping you make smart decisions. Zsofi provides full-package branding, logos, web design, and, sometimes, baking recipes.

3. Irene Nam, Republic of Korea

Irene previously worked in the United States at IBT Media, then decided to give a freelance career a go. The verdict: she loves going solo. Her clients are happy with her and consider her work creative, professional, and interactive. Her core strengths lie in web design and user interaction design, where she seamlessly adds flow, depth, and visual style to create unique layouts for her projects.

4. Gisele Jaquenod, Norway  

Gisele has been featured on Lee Monroe, Smashing Magazine, and Vandelay Design repeatedly for her blog designs for small businesses. With a Bachelor's in Visual Communication Design, she taught design until 2009, when she moved from Argentina to Norway. Her designs pop with colorful artwork and cute features, and she has even created several free blog templates. Her customers compliment her on her patience, keen eye for detail, and ability to work quickly and efficiently once she has helped you with your creative brief. Gisele understands the nuances of color and shape and how these factors will impact the feel and tone of your brand.

5. Janna Hagan, Canada 

Janna, the .net Young Designer of the Year winner (2011), is on the list of 30 must-follow Twitter accounts for web designers. She is a user interface designer for websites and mobile applications and makes her designs unique to reflect your brand. She has clients from all over the world, some of the most notable being the National Breast Cancer Foundation of America, Consumer Media Network, and the East London Royals professional baseball team. Janna's background is in both design and marketing, which brings a great mix to each new project.

6. Veerle Pieters, Belgium

Veerle has been the CEO of the company Duoh! since 1992 and chooses projects based on how well she connects with a company. Notable projects she has worked on include Expression Engine 2.0's GUI and interactive mini-websites for the Library of Congress. She notes that she became known in the industry because of the CSS tutorials she published in her personal journal. She also runs panels at the SXSWi convention, and in 2008 she ended up on NxE's list of the Fifty Most Influential Female Bloggers.

7. Inayaili de León Persson, England 

Inayaili is the Lead Web Designer at Canonical, the company that delivers Ubuntu. With eight years of industry experience, she specializes in cross-browser, semantic HTML and CSS and clean, functional design. She has written numerous articles about design for Smashing Magazine, London Chronicles, and A List Apart. Inayaili has spoken at several conferences, including London Web Standards 2011 and Smashing Conference 2013.

These seven women may not be household names, but the contracts they have landed suggest they are quickly making their way to the top of a highly competitive industry. They are inspirations to all the other women who are contemplating careers and don't even consider web design because it isn't seen as a 'women's' field. Hello to you all. The list above proves the world bows to those who do!

Know of other women out there who are making it in the web design industry? Let us know and we will try to highlight them in another post!

Why Websites Are Important

In today’s digital age, having a website is essential for any business or organization. A website is the most effective way to reach potential customers, build brand recognition, and establish credibility. Here are just a few of the reasons why it’s important to have a website:

1. Reach More Customers: A website allows you to reach more customers than ever before. It gives you the opportunity to showcase your products and services to a global audience, 24 hours a day, 7 days a week. With the right marketing strategies in place, you can increase your customer base and generate more sales.

2. Build Brand Recognition: A website helps you build brand recognition by providing an online presence for your business or organization. It allows you to create an identity that customers can recognize and trust. You can also use your website to showcase your products and services in an attractive way that will draw in potential customers.

3. Establish Credibility: Having a website helps establish credibility for your business or organization by providing potential customers with information about who you are and what you do. You can use your website to share customer testimonials, case studies, awards, certifications, and other credentials that demonstrate why customers should trust you with their business.

4. Improve Customer Service: A website allows you to provide better customer service by giving customers access to information about your products and services at any time of day or night. You can also use it as an avenue for customer feedback so that you can make improvements based on their suggestions or complaints.

5. Generate Leads: A website is one of the most effective ways to generate leads for your business or organization because it allows potential customers to find out more about what you offer without having to contact you directly first. You can also use it as an avenue for collecting contact information from visitors so that you can follow up with them later on down the line when they’re ready to make a purchase decision.

Having a website is essential in today’s digital world because it gives businesses and organizations the opportunity to reach more customers, build brand recognition, establish credibility, improve customer service, and generate leads all at once!

Protect yourself while browsing the web

Are you looking for a way to protect your online privacy and security? If so, then you should consider using a Virtual Private Network (VPN). A VPN is a secure connection that allows you to access the internet without having to worry about your data being exposed. Here are some of the reasons why you should use a VPN.

1. Protect Your Privacy: A VPN encrypts all of your data, making it extremely difficult for anyone to track or monitor your online activities. This means that even if someone were to intercept your traffic, they wouldn’t be able to read it or understand what you’re doing online. This is especially important if you’re using public Wi-Fi networks, as these are often unsecured and can be easily hacked.

2. Access Blocked Content: Many countries have restrictions on what content can be accessed online. With a VPN, you can bypass these restrictions and access any website or service that may be blocked in your country. This is especially useful if you’re traveling abroad and want to access services like Netflix or Hulu that may not be available in the country you’re visiting.

3. Save Money: Many websites offer discounts for users who connect through a VPN. By connecting through a VPN, you can often get cheaper prices on flights, hotels, and other services than those who don’t use one.

4. Improved Security: When using public Wi-Fi networks, it’s easy for hackers to intercept your data and steal sensitive information like passwords and credit card numbers. With a VPN, all of your data is encrypted so even if someone were to intercept it, they wouldn’t be able to read it or understand what you’re doing online.

Using a VPN is an easy way to protect yourself from hackers and other cyber criminals while also giving yourself access to content that may otherwise be blocked in your country. So if you want to stay safe online and save money while doing so, then consider using a Virtual Private Network today!