A Detailed Guide to Configuring Different Partition Types in Apache Hudi


1. Introduction

Apache Hudi supports several partitioning schemes for a dataset, such as multi-field partitions, single-field partitions, date partitions, and non-partitioned datasets. Users can pick the scheme that fits their needs. The sections below walk through how to configure each type of partition in Hudi.

2. Partition Handling

To illustrate how Hudi handles the different partition types, assume the schema of the records written to Hudi is the following

{
  "type" : "record",
  "name" : "HudiSchemaDemo",
  "namespace" : "hoodie.HudiSchemaDemo",
  "fields" : [ {
    "name" : "age",
    "type" : [ "long", "null" ]
  }, {
    "name" : "location",
    "type" : [ "string", "null" ]
  }, {
    "name" : "name",
    "type" : [ "string", "null" ]
  }, {
    "name" : "sex",
    "type" : [ "string", "null" ]
  }, {
    "name" : "ts",
    "type" : [ "long", "null" ]
  }, {
    "name" : "date",
    "type" : [ "string", "null" ]
  } ]
}

A concrete sample record looks like this

{
  "name": "zhangsan", 
  "ts": 1574297893837, 
  "age": 16, 
  "location": "beijing", 
  "sex":"male", 
  "date":"2020/08/16"
}

2.1 Single Partition

A single partition uses one field as the partition field. The field can be either a non-date field (such as location) or a date field (such as date).

2.1.1 Partitioning by a Non-Date Field

Using the location field above as the partition field, the configuration for writing to Hudi and syncing to Hive is as follows

// df, tableName, basePath, and saveMode are assumed to be defined earlier;
// partitionFields, keyGenerator, hivePartitionFields, and hivePartitionExtractorClass
// hold the per-partition-type settings explained in the sections below.
df.write().format("org.apache.hudi").
                options(getQuickstartWriteConfigs()).
                option(DataSourceWriteOptions.TABLE_TYPE_OPT_KEY(), "COPY_ON_WRITE").
                option(DataSourceWriteOptions.PRECOMBINE_FIELD_OPT_KEY(), "ts").
                option(DataSourceWriteOptions.RECORDKEY_FIELD_OPT_KEY(), "name").
                option(DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY(), partitionFields).
                option(DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY(), keyGenerator).
                option(TABLE_NAME, tableName).
                // Hive sync: create/update the corresponding Hive table after the write
                option("hoodie.datasource.hive_sync.enable", true).
                option("hoodie.datasource.hive_sync.table", tableName).
                option("hoodie.datasource.hive_sync.username", "root").
                option("hoodie.datasource.hive_sync.password", "123456").
                option("hoodie.datasource.hive_sync.jdbcurl", "jdbc:hive2://localhost:10000").
                option("hoodie.datasource.hive_sync.partition_fields", hivePartitionFields).
                option("hoodie.datasource.write.table.type", "COPY_ON_WRITE").
                option("hoodie.embed.timeline.server", false).
                option("hoodie.datasource.hive_sync.partition_extractor_class", hivePartitionExtractorClass).
                mode(saveMode).
                save(basePath);

The following configuration options are worth noting

  • DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY() is set to location;
  • hoodie.datasource.hive_sync.partition_fields is set to location, matching the partition field written to Hudi;
  • DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY() is set to org.apache.hudi.keygen.SimpleKeyGenerator, or may be left unset since org.apache.hudi.keygen.SimpleKeyGenerator is the default;
  • hoodie.datasource.hive_sync.partition_extractor_class is set to org.apache.hudi.hive.MultiPartKeysValueExtractor

The table created in Hive by Hudi's Hive sync is as follows

CREATE EXTERNAL TABLE `notdateformatsinglepartitiondemo`(
  `_hoodie_commit_time` string,
  `_hoodie_commit_seqno` string,
  `_hoodie_record_key` string,
  `_hoodie_partition_path` string,
  `_hoodie_file_name` string,
  `age` bigint,
  `date` string,
  `name` string,
  `sex` string,
  `ts` bigint)
PARTITIONED BY (
  `location` string)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
  'org.apache.hudi.hadoop.HoodieParquetInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION
  'file:/tmp/hudi-partitions/notDateFormatSinglePartitionDemo'
TBLPROPERTIES (
  'last_commit_time_sync'='20200816154250',
  'transient_lastDdlTime'='1597563780')

Query the table notdateformatsinglepartitiondemo

Tip: before querying, place the hudi-hive-sync-bundle-xxx.jar package under $HIVE_HOME/lib


2.1.2 Partitioning by a Date Field

Using the date field above as the partition field, the key configuration options are as follows (a minimal sketch of just these options appears after the list)

  • DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY() is set to date;
  • hoodie.datasource.hive_sync.partition_fields is set to date, matching the partition field written to Hudi;
  • DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY() is set to org.apache.hudi.keygen.SimpleKeyGenerator, or may be left unset since org.apache.hudi.keygen.SimpleKeyGenerator is the default;
  • hoodie.datasource.hive_sync.partition_extractor_class is set to org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
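The snippet below is a minimal sketch of this case, not a full program: it assumes the same df, tableName, saveMode, and basePath variables as the full example in 2.1.1 and omits the unchanged hive_sync options (enable, table, username, password, jdbcurl) for brevity.

df.write().format("org.apache.hudi").
                options(getQuickstartWriteConfigs()).
                option(DataSourceWriteOptions.PRECOMBINE_FIELD_OPT_KEY(), "ts").
                option(DataSourceWriteOptions.RECORDKEY_FIELD_OPT_KEY(), "name").
                // partition by the date field; SimpleKeyGenerator is the default, so the key generator option could be dropped
                option(DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY(), "date").
                option(DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY(), "org.apache.hudi.keygen.SimpleKeyGenerator").
                option("hoodie.datasource.hive_sync.partition_fields", "date").
                option("hoodie.datasource.hive_sync.partition_extractor_class", "org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor").
                option(TABLE_NAME, tableName).
                mode(saveMode).
                save(basePath);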

The table created in Hive by Hudi's Hive sync is as follows

CREATE EXTERNAL TABLE `dateformatsinglepartitiondemo`(
  `_hoodie_commit_time` string,
  `_hoodie_commit_seqno` string,
  `_hoodie_record_key` string,
  `_hoodie_partition_path` string,
  `_hoodie_file_name` string,
  `age` bigint,
  `location` string,
  `name` string,
  `sex` string,
  `ts` bigint)
PARTITIONED BY (
  `date` string)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
  'org.apache.hudi.hadoop.HoodieParquetInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION
  'file:/tmp/hudi-partitions/dateFormatSinglePartitionDemo'
TBLPROPERTIES (
  'last_commit_time_sync'='20200816155107',
  'transient_lastDdlTime'='1597564276')

Query the table dateformatsinglepartitiondemo


2.2 Multiple Partition Fields

Multi-field partitioning uses several fields as the partition fields, for example the location and sex fields above. The key configuration options are as follows (a minimal sketch of just these options appears after the list)

  • DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY() is set to location,sex;
  • hoodie.datasource.hive_sync.partition_fields is set to location,sex, matching the partition fields written to Hudi;
  • DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY() is set to org.apache.hudi.keygen.ComplexKeyGenerator
  • hoodie.datasource.hive_sync.partition_extractor_class is set to org.apache.hudi.hive.MultiPartKeysValueExtractor
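Again as a minimal sketch under the same assumptions as before (same df, tableName, saveMode, and basePath as in 2.1.1; unchanged hive_sync options omitted):

df.write().format("org.apache.hudi").
                options(getQuickstartWriteConfigs()).
                option(DataSourceWriteOptions.PRECOMBINE_FIELD_OPT_KEY(), "ts").
                option(DataSourceWriteOptions.RECORDKEY_FIELD_OPT_KEY(), "name").
                // two partition fields, comma separated, with the complex key generator
                option(DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY(), "location,sex").
                option(DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY(), "org.apache.hudi.keygen.ComplexKeyGenerator").
                option("hoodie.datasource.hive_sync.partition_fields", "location,sex").
                option("hoodie.datasource.hive_sync.partition_extractor_class", "org.apache.hudi.hive.MultiPartKeysValueExtractor").
                option(TABLE_NAME, tableName).
                mode(saveMode).
                save(basePath);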

The table created in Hive by Hudi's Hive sync is as follows

CREATE EXTERNAL TABLE `multipartitiondemo`(
  `_hoodie_commit_time` string,
  `_hoodie_commit_seqno` string,
  `_hoodie_record_key` string,
  `_hoodie_partition_path` string,
  `_hoodie_file_name` string,
  `age` bigint,
  `date` string,
  `name` string,
  `ts` bigint)
PARTITIONED BY (
  `location` string,
  `sex` string)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
  'org.apache.hudi.hadoop.HoodieParquetInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION
  'file:/tmp/hudi-partitions/multiPartitionDemo'
TBLPROPERTIES (
  'last_commit_time_sync'='20200816160557',
  'transient_lastDdlTime'='1597565166')

Query the table multipartitiondemo


2.3 Non-Partitioned

The non-partitioned scenario has no partition field at all: the dataset written to Hudi is not partitioned. The key configuration is as follows (a minimal sketch of just these options appears after the list)

  • DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY() is set to an empty string;
  • hoodie.datasource.hive_sync.partition_fields is set to an empty string, matching the (absent) partition field written to Hudi;
  • DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY() is set to org.apache.hudi.keygen.NonpartitionedKeyGenerator
  • hoodie.datasource.hive_sync.partition_extractor_class is set to org.apache.hudi.hive.NonPartitionedExtractor
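A minimal sketch under the same assumptions as before (same df, tableName, saveMode, and basePath as in 2.1.1; unchanged hive_sync options omitted):

df.write().format("org.apache.hudi").
                options(getQuickstartWriteConfigs()).
                option(DataSourceWriteOptions.PRECOMBINE_FIELD_OPT_KEY(), "ts").
                option(DataSourceWriteOptions.RECORDKEY_FIELD_OPT_KEY(), "name").
                // empty partition path plus the non-partitioned key generator and extractor
                option(DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY(), "").
                option(DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY(), "org.apache.hudi.keygen.NonpartitionedKeyGenerator").
                option("hoodie.datasource.hive_sync.partition_fields", "").
                option("hoodie.datasource.hive_sync.partition_extractor_class", "org.apache.hudi.hive.NonPartitionedExtractor").
                option(TABLE_NAME, tableName).
                mode(saveMode).
                save(basePath);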

The table created in Hive by Hudi's Hive sync is as follows

CREATE EXTERNAL TABLE `nonpartitiondemo`(
  `_hoodie_commit_time` string,
  `_hoodie_commit_seqno` string,
  `_hoodie_record_key` string,
  `_hoodie_partition_path` string,
  `_hoodie_file_name` string,
  `age` bigint,
  `date` string,
  `location` string,
  `name` string,
  `sex` string,
  `ts` bigint)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
  'org.apache.hudi.hadoop.HoodieParquetInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION
  'file:/tmp/hudi-partitions/nonPartitionDemo'
TBLPROPERTIES (
  'last_commit_time_sync'='20200816161558',
  'transient_lastDdlTime'='1597565767')

Query the table nonpartitiondemo


2.4 Hive-Style Partitioning

Besides the common schemes above, Hudi also supports Hive-style partition paths such as location=beijing/sex=male, with location and sex as the partition fields. The key configuration is as follows (a minimal sketch of just these options appears after the list)

  • DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY() is set to location,sex;
  • hoodie.datasource.hive_sync.partition_fields is set to location,sex, matching the partition fields written to Hudi;
  • DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY() is set to org.apache.hudi.keygen.ComplexKeyGenerator
  • hoodie.datasource.hive_sync.partition_extractor_class is set to org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
  • DataSourceWriteOptions.HIVE_STYLE_PARTITIONING_OPT_KEY() is set to true
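A minimal sketch under the same assumptions as before (same df, tableName, saveMode, and basePath as in 2.1.1; unchanged hive_sync options omitted); the only change relative to 2.2 is the hive-style partitioning flag:

df.write().format("org.apache.hudi").
                options(getQuickstartWriteConfigs()).
                option(DataSourceWriteOptions.PRECOMBINE_FIELD_OPT_KEY(), "ts").
                option(DataSourceWriteOptions.RECORDKEY_FIELD_OPT_KEY(), "name").
                option(DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY(), "location,sex").
                option(DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY(), "org.apache.hudi.keygen.ComplexKeyGenerator").
                // write location=beijing/sex=male style partition directories
                option(DataSourceWriteOptions.HIVE_STYLE_PARTITIONING_OPT_KEY(), "true").
                option("hoodie.datasource.hive_sync.partition_fields", "location,sex").
                option("hoodie.datasource.hive_sync.partition_extractor_class", "org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor").
                option(TABLE_NAME, tableName).
                mode(saveMode).
                save(basePath);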

The generated Hudi dataset then uses a directory layout of the form

/location=beijing/sex=male

The table created in Hive by Hudi's Hive sync is as follows

CREATE EXTERNAL TABLE `hivestylepartitiondemo`(
  `_hoodie_commit_time` string,
  `_hoodie_commit_seqno` string,
  `_hoodie_record_key` string,
  `_hoodie_partition_path` string,
  `_hoodie_file_name` string,
  `age` bigint,
  `date` string,
  `name` string,
  `ts` bigint)
PARTITIONED BY (
  `location` string,
  `sex` string)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
  'org.apache.hudi.hadoop.HoodieParquetInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION
  'file:/tmp/hudi-partitions/hiveStylePartitionDemo'
TBLPROPERTIES (
  'last_commit_time_sync'='20200816172710',
  'transient_lastDdlTime'='1597570039')

Query the table hivestylepartitiondemo


3. Summary

This post walked through how Hudi handles different partitioning scenarios. The partition configurations shown above cover the vast majority of use cases. Hudi is also very flexible and supports custom partition handling; see the KeyGenerator and PartitionValueExtractor classes for details. Every generator of partition fields written to Hudi is a subclass of KeyGenerator, and every resolver of partition values synced to Hive is a subclass of PartitionValueExtractor.
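For illustration only, here is a rough, hypothetical sketch of such a custom key generator. It assumes the KeyGenerator API of the Hudi release used in this post (a constructor taking TypedProperties and a single getKey(GenericRecord) method); the class name, the partitioning rule, and the import paths are invented for the example and may need adjusting to the exact Hudi version in use.

import org.apache.avro.generic.GenericRecord;
import org.apache.hudi.common.config.TypedProperties;
import org.apache.hudi.common.model.HoodieKey;
import org.apache.hudi.keygen.KeyGenerator;

// Hypothetical generator: record key taken from the name field, partition path taken from
// the first character of the location field (e.g. "beijing" becomes partition "b").
public class FirstLetterKeyGenerator extends KeyGenerator {

  public FirstLetterKeyGenerator(TypedProperties props) {
    super(props);
  }

  @Override
  public HoodieKey getKey(GenericRecord record) {
    String recordKey = record.get("name").toString();
    String partitionPath = record.get("location").toString().substring(0, 1);
    return new HoodieKey(recordKey, partitionPath);
  }
}

Such a class would then be referenced through DataSourceWriteOptions.KEYGENERATOR_CLASS_OPT_KEY(), just like the built-in generators used above.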
