This repository has been archived by the owner on Dec 20, 2018. It is now read-only.
In my project I am trying to convert a DataFrame's StructType schema to an Avro schema with SchemaConverters.convertStructToAvro. Does anybody have an example of how to use this converter?

I am using Scala with com.databricks:spark-avro_2.11:4.0.0.
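For reference, those coordinates as an sbt dependency (a minimal sketch, assuming an sbt build on Scala 2.11):

```scala
// spark-avro_2.11 4.0.0 from above, expressed as an sbt dependency.
libraryDependencies += "com.databricks" %% "spark-avro" % "4.0.0"
```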
```scala
import java.io.ByteArrayOutputStream

import scala.util.Try

import org.apache.avro.{Schema, SchemaBuilder}
import org.apache.avro.SchemaBuilder.RecordBuilder
import org.apache.avro.generic.GenericRecord
import org.apache.avro.io.EncoderFactory
import org.apache.avro.specific.SpecificDatumWriter
import org.apache.spark.sql.{DataFrame, Encoders, SparkSession}
import org.apache.spark.sql.types.StructType

import com.databricks.spark.avro.SchemaConverters

val recordName: String = "Model"
val recordNamespace: String = "Test"

val sparkSession: SparkSession = SparkSession.builder().master("local[*]").getOrCreate()

// Avro record builder that convertStructToAvro fills in from the Spark schema.
val builder: RecordBuilder[Schema] = SchemaBuilder.record(recordName).namespace(recordNamespace)

def serialize(): Unit = {
  import sparkSession.implicits._

  val inputDF: DataFrame =
    sparkSession.sparkContext.parallelize(Seq(Model(Some(List("param1"))))).toDF()

  // Spark StructType derived from the case class.
  val structType: StructType = Encoders.product[Model].schema

  // Convert the Spark StructType into an Avro Schema.
  val schema: Schema = SchemaConverters.convertStructToAvro(structType, builder, recordNamespace)

  // getGenericRecord (defined elsewhere) turns a Row into an Avro GenericRecord.
  val genericRecord: GenericRecord = getGenericRecord(inputDF.head())

  avroSerialize(schema, genericRecord)
}

case class Model(params: Option[List[String]])

def avroSerialize(schema: Schema, genericRecord: GenericRecord): Try[Array[Byte]] = {
  Try {
    val writer = new SpecificDatumWriter[GenericRecord](schema)
    val out = new ByteArrayOutputStream()
    val encoder = EncoderFactory.get().binaryEncoder(out, null)
    writer.write(genericRecord, encoder)
    encoder.flush()
    out.close()
    out.toByteArray
  }
}
```
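The snippet above calls a getGenericRecord helper that isn't shown in the post. Below is a minimal sketch of what such a helper could look like, assuming the Row's field names match the generated Avro schema; the explicit schema parameter is my own addition (the original one-argument version presumably closes over the schema):

```scala
import scala.collection.JavaConverters._

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericRecord}
import org.apache.spark.sql.Row

// Hypothetical helper: copies Row fields into a GenericRecord by name.
// Scala sequences are converted to java.util.List because Avro array fields
// expect Java collections; other nested types may need additional handling.
def getGenericRecord(row: Row, schema: Schema): GenericRecord = {
  val record = new GenericData.Record(schema)
  schema.getFields.asScala.foreach { field =>
    val value = row.getAs[Any](field.name()) match {
      case seq: Seq[_] => seq.asJava
      case other       => other
    }
    record.put(field.name(), value)
  }
  record
}
```

To sanity-check the output, the resulting bytes can be decoded back with a GenericDatumReader and DecoderFactory.get().binaryDecoder.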