EXCLUSIVE INTERVIEW

Rewards for Recycling: Q&A With Gazelle CEO Israel Ganot, Part 2

In the fast-moving world of consumer electronics, last year’s gaming system and smartphone are old news. Luckily, however, they’re not entirely worthless.

Electronics recommerce company Gazelle buys this equipment, offering consumers cash, as well as free packaging and shipping — and then resells it for a profit.

In Part 2 of his exclusive interview with the E-Commerce Times, Gazelle cofounder and CEO Israel Ganot talks about the company’s electronic recycling philosophy, business model and future.

E-Commerce Times: What exactly does “recommerce” mean?

Israel Ganot:

Recommerce is a new consumer model that rewards consumers for smart consumption.

ECT: Where did the name “Gazelle” come from?

Ganot:

We wanted something short and memorable. It’s the second fastest animal, and it has a connection to the environment. We also like the slogan, “Don’t just sell it, Gazelle it.” The tagline we’re using now is “Keep it moving.” Don’t let things get stale, give those devices a new life. It’s all about keeping things moving.

ECT: How does Gazelle work?

Ganot:

Most consumers engage with Gazelle when they upgrade to a new device. You'll go to Gazelle, search for your specific model, answer questions about its physical condition, and we'll make you an offer. If you accept the offer, we'll send you packaging and pay for the shipping. Once we get the item in our facility, within a week it will be received, inspected and data wiped, and you'll get paid by check, through PayPal or with an Amazon gift card, or you can donate the proceeds to a charity.

Another way you can engage is to go to a site like Walmart.com and go to their electronics trade-in and recycling, powered by Gazelle, and do it that way. If you go to Gazelle through the Walmart website, you’ll get a Walmart gift card.

ECT: What is Gazelle’s business model? How does it make money?

Ganot:

Once we collect these devices and pay consumers, that becomes our inventory. The only difference between us and a typical retailer is that a retailer buys from wholesalers, while we build our inventory from consumers, and then we resell the product in the secondary market.

We sell in a lot of different places. Our biggest channels are eBay, Amazon, wholesale channels and international buyers. Demand for the product is insatiable. People who live on the coasts want the latest and greatest products. We buy on the coasts, and then we sell to the middle of the country and to developing markets.

Our biggest challenge as a business is buying more inventory, and buying more inventory is all about educating consumers about recommerce.

ECT: What’s the benefit of using Gazelle as opposed to other similar services?

Ganot:

We are seeing a lot of competition from other companies providing trade-in services, mostly retailers and e-commerce companies. That's extremely positive, since the biggest challenge in our business is awareness.

Ultimately, it will help change consumer behavior. What's different about Gazelle is our customer experience, which is the best in the industry. It's all about delivering that experience every single day to every customer: free shipping, free packaging, and the way we handle communication with our users. We see our users coming back to us over and over again.

ECT: What role has social media played in promoting and growing Gazelle?

Ganot:

The primary way we use social media is for customer care, mostly using Twitter and Facebook. We also give our users the tools they need to evangelize, and we've seen a lot of tweets about Gazelle. We've also seen growth in YouTube videos, with sites like Gadget Lab providing videos on fixing devices.

ECT: Are there any safety or privacy concerns with selling electronics on Gazelle?

Ganot:

That's one of the most important services we provide: wiping the data from electronics that customers trade in. When consumers send their items to Gazelle, they know their data is safe. Consumers rely on that. It's part of our brand and the trust we've built, and it's about doing it right every single day. We know that any data breach would affect our brand.

ECT: How is Gazelle evolving? What’s in the future for the company?

Ganot:

There are two areas in which we're going to invest. The first is sheer growth: we're starting to invest in mass media channels, with radio and TV ads, because really getting the message out is a major area of opportunity.

The second is working with our retail partners to bring the service into the retail environment, and we'll be rolling out more retail partners over the next year. We're also building the infrastructure of the company, which needs to scale with the growth of the business. And we're thinking about international expansion and other categories we can move into.

Freelance writer Vivian Wagner has wide-ranging interests, from technology and business to music and motorcycles. She writes features regularly for ECT News Network, and her work has also appeared in American Profile, Bluegrass Unlimited, and many other publications. For more about her, visit her website.

EXCLUSIVE INTERVIEW

The Business Case for Clean Data and Governance Planning

Do you know if your company’s data is clean and well managed? Why does that matter anyway?

Without a working governance plan, you might not have a company to worry about — data-wise.

Data governance is a collection of practices and processes establishing the rules, policies, and procedures that ensure data accuracy, quality, reliability, and security. It ensures the formal management of data assets within an organization.

Everyone in business understands the need to have and use clean data. But ensuring that it is clean and usable is a big challenge, according to David Kolinek, vice president of product management at Ataccama.

That challenge is even greater when business users must rely on scarce technical resources. Often, no one person oversees data governance, or that individual lacks a complete understanding of how the data will be used and how to clean it.

This is where Ataccama comes into play. The company's mission is to provide a solution that even people without technical knowledge, such as SQL skills, can use to find the data they need, evaluate its quality, understand how to fix any issues, and determine whether that data will serve their purposes.

“With Ataccama, business users don’t need to involve IT to manage, access, and clean their data,” Kolinek told TechNewsWorld.

Keeping Users in Mind

Ataccama was founded in 2007 and basically bootstrapped.

It started as a part of Adastra, a consulting company that is still in business today. However, Ataccama was focused on software rather than consulting, so management spun the operation off as a product company that addresses data quality issues.

Ataccama started with a basic approach: an engine that performed basic data cleansing and transformation. But it still required an expert user, because the configuration had to be supplied by the user.

“So, we added a visual presentation for the steps that enable data transformation and things like cleansing. This made it a low-code platform since the users were able to do the majority of the work just by using the application user interface. But it was still a thick-client platform,” Kolinek explained.

The current version, however, is designed with a non-technical user in mind. The software includes a thin client, a focus on automation, and an easy-to-use interface.

“But what really stands out is the user experience, which is built off the seamless integration we were able to achieve with the 13th version of our engine. It delivers robust performance that’s tuned to perfection,” he offered.

Digging Deeper Into Data Management Issues

I asked Kolinek to discuss the data governance and quality issues further. Here is our conversation.

TechNewsWorld: How does Ataccama’s concept of centralizing or consolidating data management differ from other cloud systems such as Microsoft, Salesforce, AWS, and Google Cloud?

David Kolinek: We are platform agnostic and do not target one specific technology. Microsoft and AWS have their own native solutions that work well, but only within their own infrastructure. Our portfolio is wide open so it can serve all the use cases that must be covered across any infrastructure.

Further, we have data processing and metadata management capabilities that not all cloud providers possess. Metadata is useful for automated processing, which generates more metadata that can in turn be used for additional analytics.

We developed both of these technologies in-house so we can provide native integration. As a result, we can deliver a superior user experience and a whole lot of automation.

How is this concept different from the notion of standardization of data?

David Kolinek, VP of Product Management, Ataccama

Kolinek: Standardization is just one of many things we do. Usually, standardization can be easily automated, the same way we can automate cleansing or data enrichment. We also provide manual data correction when solving some issues, like a missing social security number.

We cannot generate the SSN, but we could come up with a date of birth from other information. So, standardization is not different. It is a subset of things that improve quality. But for us, it is not only about data standardization. It is about having good quality data so information can be properly leveraged.
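
To make that concrete, here is a minimal Python sketch of the idea, not Ataccama's actual API: a rule standardizes phone formats automatically, while a record with a missing SSN is routed to a manual-correction queue, since that value cannot be generated. The field names and data are invented for illustration.

```python
import re

# Hypothetical records; field names and values are invented for illustration.
records = [
    {"name": "Ann Lee", "phone": "(617) 555-0142", "ssn": "123-45-6789"},
    {"name": "Bo Chan", "phone": "617.555.0199", "ssn": None},
]

def standardize_phone(raw: str) -> str:
    """Rule-based standardization: keep the last 10 digits, format as XXX-XXX-XXXX."""
    digits = re.sub(r"\D", "", raw)[-10:]
    return f"{digits[0:3]}-{digits[3:6]}-{digits[6:10]}"

manual_review = []  # issues only a person can resolve, such as a missing SSN

for rec in records:
    rec["phone"] = standardize_phone(rec["phone"])  # automatable cleansing
    if not rec["ssn"]:                              # cannot be generated automatically
        manual_review.append(rec)

print(records)
print("Needs manual correction:", manual_review)
```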

How does Ataccama’s data management platform benefit users?

Kolinek: The user experience is really our biggest benefit, and the platform is ideal for handling multiple personas. Companies need to enable both business users and IT people when it comes to data management. That requires a solution for business and IT to collaborate.

Another enormous benefit of our platform is the strong synergy between data processing and metadata management it provides.

The majority of other data management vendors cover only one of these areas. We also use both machine learning and a rules-based approach to validation and standardization, which, again, other vendors often do not support together.

Also, because we are technology agnostic, users can connect to many different technologies from the same platform. With edge processing, for instance, you can configure something once in Ataccama ONE, and the platform will translate it for different platforms.
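
As a rough sketch of that "configure once, translate per platform" idea, the snippet below renders a single declarative quality rule as SQL for several engines. The rule format and translator are invented for illustration and are not Ataccama ONE's actual configuration language; only the per-dialect regex syntax is standard SQL.

```python
# Sketch of "configure once, translate per platform." The rule format and the
# translator below are invented for illustration; this is not Ataccama ONE's
# actual configuration language.

RULE = {"table": "customers", "column": "email", "pattern": r"^[^@]+@[^@]+$"}

def invalid_rows_query(rule: dict, dialect: str) -> str:
    """Render one declarative pattern rule as a validation query for one engine."""
    col, pat = rule["column"], rule["pattern"]
    if dialect == "postgres":
        predicate = f"{col} !~ '{pat}'"                      # PostgreSQL regex operator
    elif dialect == "bigquery":
        predicate = f"NOT REGEXP_CONTAINS({col}, r'{pat}')"  # BigQuery standard SQL
    else:
        predicate = f"{col} NOT RLIKE '{pat}'"               # Spark SQL
    return f"SELECT COUNT(*) AS bad_rows FROM {rule['table']} WHERE {predicate}"

for engine in ("postgres", "bigquery", "spark"):
    print(engine, "->", invalid_rows_query(RULE, engine))
```

The loop prints one validation query per engine from the same rule definition; in a real platform the user would define the rule once and the system, not the user, would dispatch it to each backend.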

Does Ataccama's platform lock in users the way proprietary software often does?

Kolinek: We developed all the core components of the platform ourselves. They are tightly integrated together. There has been a huge wave of acquisitions lately in this space, with big vendors buying smaller ones to fill in gaps. In some cases, you are not really buying and managing one platform, but many.

With Ataccama, you can purchase just one module, like data quality/standardization, and later expand to others, such as master data management (MDM). It all works together seamlessly. Just activate our modules as you need them. This makes it easy for customers to start small and expand when the time is right.

Why is a unified data platform so important in this process?

Kolinek: The biggest benefit of a unified platform is that companies are not looking for a point solution to solve just a single problem, like data standardization. It is all interconnected.

For instance, to standardize you must validate the quality of the data, and for that, you must first find and catalog it. If you have an issue, even though it may look like a discrete problem, it more than likely involves many other aspects of data management.

The beauty of a unified platform is that in most use cases, you have one solution with native integration, and you can start using other modules.
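
A toy pipeline can illustrate why those pieces hang together: cataloging discovers the fields, validation measures quality, and standardization fixes what validation flagged. This is a generic Python sketch with invented data, not the Ataccama platform.

```python
# Toy pipeline showing the interconnection: catalog the fields, validate their
# quality, then standardize what validation flagged. Purely illustrative.

rows = [{"country": "USA"}, {"country": "U.S."}, {"country": "Germany"}]

def catalog(rows):
    """Step 1: discover which fields exist in the dataset."""
    return sorted({key for row in rows for key in row})

def validate(rows, field, allowed):
    """Step 2: measure quality, returning rows whose value is outside the allowed set."""
    return [row for row in rows if row.get(field) not in allowed]

def standardize(rows, field, mapping):
    """Step 3: fix the flagged values with a mapping rule."""
    for row in rows:
        row[field] = mapping.get(row[field], row[field])
    return rows

print("fields:", catalog(rows))
print("invalid before:", len(validate(rows, "country", {"United States", "Germany"})))
standardize(rows, "country", {"USA": "United States", "U.S.": "United States"})
print("invalid after:", len(validate(rows, "country", {"United States", "Germany"})))
```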

What role do AI and ML play today in data governance, data quality, and master data management? How is it changing the process?

Kolinek: Machine learning enables customers to be more proactive. Previously, you would identify and report an issue. Someone would have to investigate what went awry and see if there was something wrong with the data. Then you would create a rule for data quality to prevent a recurrence. That is all reactive and is based on something breaking down, being found, reported, and then fixed.

Again, ML lets you be proactive. You give it training data instead of rules. The platform then detects differences in patterns and identifies anomalies, alerting you before you even realize there is an issue. This is not possible with a rules-based approach, and it is much easier to scale when you have a huge number of data sources. The more data you have, the better the training and its accuracy will be.
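
A minimal sketch of that proactive approach, using scikit-learn's IsolationForest rather than Ataccama's engine: the model is trained on examples of normal values and then flags an outlier that no hand-written rule anticipated. The data and parameters are invented for illustration.

```python
# Minimal sketch of proactive, ML-driven anomaly detection using scikit-learn's
# IsolationForest. Generic illustration, not Ataccama's engine: the model learns
# "normal" from training data instead of relying on hand-written rules.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Training data: daily order totals representing normal behavior (invented values).
normal_totals = rng.normal(loc=100.0, scale=10.0, size=(500, 1))

model = IsolationForest(contamination=0.01, random_state=0).fit(normal_totals)

# New batch: mostly normal, plus one value no rule writer anticipated.
new_batch = np.array([[98.0], [104.0], [11000.0]])
flags = model.predict(new_batch)  # +1 = looks normal, -1 = anomaly

for value, flag in zip(new_batch.ravel(), flags):
    print(value, "anomaly" if flag == -1 else "ok")
```

Here the outlier is flagged even though no one wrote a rule for it; with a purely rules-based approach, someone would first have to notice the breakage and then encode a check for it.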

Other than cost savings, what benefits can enterprises gain through consolidating their data repositories? For instance, does it improve security, CX outcomes, etc.?

Kolinek: It does improve security and mitigates potential future leaks. For example, we had customers who were storing data that no one was using. In many cases, they did not even know the data existed! Now, they are not only unifying their technology stack, but they can also see all the stored data.

Onboarding new people onto the platform is also much easier with consolidated data. The more transparent the environment, the sooner people can use it and start gaining value.

It is not so much about saving money as it is about leveraging all your data to generate a competitive advantage and generate additional revenue. It provides data scientists with the means to build things that will advance the business.

What are the steps in adopting a data management platform?

Kolinek: Begin with the initial analysis. Focus on the biggest issues the company wants to tackle and select the platform modules to address them. Defining goals is key at this stage. What KPIs do you want to target? What level of ID do you want to achieve? These are questions you need to ask.

Next, you need a champion to advance execution and identify the main stakeholders who could drive the initiative. That requires extensive communications among different stakeholders, so it is vital to have someone focused on educating others about the benefits and helping teams onboard the system. Then comes the implementation phase where you address the key issues identified in the analysis, followed by rollout.

Finally, think about the next set of issues that need to be addressed, and if needed, enable additional modules in the platform to achieve those goals. The worst thing to do is purchase a tool and hand it over with no service, education, or support; that all but guarantees low adoption. Education, support, and service are very important for the adoption phase.

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics. Email Jack.
