Vertica Blog

Hadoop


Viewing Parquet Export Events More Easily

The EXPORT TO PARQUET command exports a table, columns from a table, or query results to files in the Parquet format. When you run EXPORT TO PARQUET, information about the files created during the export is stored in the Vertica log. It's no fun combing through a Vertica log looking for those particular records. Good...
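As a quick orientation before the tip itself, here is a minimal sketch of the command in question; the table and directory names are hypothetical. The statement output only reports a row count, while the details about each Parquet file written land in vertica.log, which is what the post goes on to address.

-- Hypothetical table and target directory; the directory must not already exist.
dbadmin=> EXPORT TO PARQUET (directory = '/home/dbadmin/sales_export')
          AS SELECT order_id, order_date, amount FROM sales;
-- Per-file details of the export are recorded in vertica.log, not returned here.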

Complex Data Types in SQL 2 – Benefits of SQL Support

Co-authored by James Clampffer, Deepak Majeti. In this second post in the blog series, let’s look at some of the reasons complex types are used extensively in organized big data formats like ORC and Parquet. Be sure to read the first post in this series, where we discussed what complex data types are and explored the various complex types in...
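To make the "SQL support" benefit concrete, here is a hedged sketch of the kind of query nested Parquet data allows once it is exposed through SQL. The table and columns are hypothetical (see the external-table sketch under "Complex Data Types in SQL 1" below), and the exact field-access syntax depends on your Vertica version.

-- Hypothetical query over nested data: a ROW (struct) column and an ARRAY column.
dbadmin=> SELECT order_id,
                 customer.city,     -- dot notation reads a field inside a ROW
                 line_prices[0]     -- bracket indexing reads the first array element
          FROM orders_parquet
          WHERE customer.country = 'US';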

Is Your Future Cloudy? This Buyer’s Guide Offers Clarity

When I started at Vertica about seven years ago, I was met by anxious salespeople warning me that Hadoop (HDFS, specifically) was going to completely replace the data warehouse as we knew it. And we should be very concerned. That was 2012, the year I first encountered the lovable elephant and met swarms of Hadoop...

Complex Data Types in SQL 1 – What Are They?

Co-authored by James Clampffer, Deepak Majeti. A lot of projects require querying ORC or Parquet files, or other data that may have internal types that are a bit more complex than usual. We support some of those complex types now, and are dedicated to adding full support for all complex types in Vertica. It’s a...
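As one concrete (and entirely hypothetical) illustration of the kinds of types the series covers, an external table over Parquet files can declare ROW and ARRAY columns directly; the path and column names below are made up, and which complex types are available depends on your Vertica version.

-- Hypothetical external table exposing Parquet files that contain nested fields.
dbadmin=> CREATE EXTERNAL TABLE orders_parquet (
              order_id    INT,
              customer    ROW(name VARCHAR, city VARCHAR, country VARCHAR),
              line_prices ARRAY[FLOAT]
          ) AS COPY FROM '/data/orders/*.parquet' PARQUET;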

Vertica Ranked #1 Cloud Data Warehouse

July has been quite an exciting month for Vertica. Our three keynote presenters are now confirmed for the Vertica Big Data Conference 2020 – Vertica founder and Turing Award winner Dr. Michael Stonebraker, kingpin of the famous MIT blackjack team Jeffrey Ma, and renowned analyst Ray Wang of Constellation Research. And, in addition to receiving...

What You Never Knew About Vertica Could Surprise You

I just started working on the Vertica team. As the “new guy,” my first few weeks of work have been largely about cramming as much Vertica information into my brain as possible in the shortest time possible. That’s my favorite part of working in the big data analysis world. You always have to keep learning....

Exporting to Parquet: Quick Tip

Jim Knicely authored this tip. Vertica can export a table, columns from a table, or query results to files in the Parquet format.

dbadmin=> EXPORT TO PARQUET (directory = '/home/dbadmin/dim') AS SELECT * FROM dim;
 Rows Exported
---------------
             1
(1 row)

One restriction is that the export path must not already exist. How do I get...
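To illustrate that restriction, here is a minimal sketch reusing the dim example from the tip: repeating the export against the same directory is rejected because the directory now exists, so each export needs a path that has not been created yet.

-- The first export creates /home/dbadmin/dim and succeeds (as shown above).
-- Re-running the identical statement fails, because /home/dbadmin/dim now exists.
-- Pointing the export at a directory that does not exist yet works:
dbadmin=> EXPORT TO PARQUET (directory = '/home/dbadmin/dim_20200730')
          AS SELECT * FROM dim;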

Thanks, Google!

This week began with a compliment from Google that made me so proud on behalf of Vertica! On Wednesday, July 25, Google announced support for in-database machine learning with two algorithms – linear and logistic regression – and also confirmed that these new machine learning functions could be accessed using standard SQL via Google BigQuery, opening the door for database...

The Internet of Things (IoT) and Smart Metering

Every month, my husband receives a driver’s report from Chevrolet, providing him with “feedback” on his driving behavior – everything from the intensity of his braking to the number of lane changes without a blinker to the distance and speed he drove in his Silverado truck. The report also provides specific details on the upcoming...

Saving an Apache Spark DataFrame to a Vertica Table

Before you save an Apache Spark DataFrame to a Vertica table, make sure that you have the following setup:
• Vertica cluster
• Spark cluster
• HDFS cluster
The Vertica Spark connector uses HDFS as intermediate storage before it writes the DataFrame to Vertica. This checklist identifies potential problems you might encounter when using...

Can you tell us about your data lake?

It's the fourth round of Vertica product management surveys and we have really appreciated getting your feedback! In this survey, we want to know all about your data lake. We want to know what tools you use, how much data is in your lake, and the types of workloads you are running. We are hoping...
Diagram: in Enterprise Mode, Vertica compute and storage sit together on each node; in Eon Mode, storage is centralized and compute is separate.

Introducing the Eon Mode Concept

Vertica was born at a tipping point in the world of Enterprise Data Warehouses. It was designed from the first line of code to address new levels of data volumes and analytical performance. But it was also designed to break away from the tightly integrated hardware and software appliances offered by industry leaders at the time, including...