Re: [geomesa-users] Spark ingestion of simplefeature type with no geom

On 21/05/2019 at 14:32, Emilio Lahr-Vivaz wrote:
Hello,

I don't actually see anything that requires a geometry - can you
provide a code snippet and/or a stack trace of the error?

Thanks,

Emilio

On 5/21/19 4:07 AM, GRISOT, REMI wrote:
On 20/05/2019 at 17:48, Emilio Lahr-Vivaz wrote:
Hello,

I believe that the Spark code assumes a geometry field is present. While
the GeoMesa data stores do generally support schemas without a geometry,
no one has updated the Spark integration to do the same, as it's not
really a primary use case. Feel free to open a ticket and/or a pull request!

Thanks,

Emilio

On 5/20/19 11:36 AM, GRISOT, REMI wrote:
Hello,

I am trying to ingest some data into GeoMesa, and one of the feature
types has no spatial field.
When I ingest the data without Spark I have no problem; however, if I try
to ingest data with Spark and no spatial attribute, it throws a
NullPointerException. The workaround I found for now is to set a (0,0)
point for all entries in that table, but that doesn't seem like the
proper way to do it. Is there anything in particular to do so that Spark
can ingest features with no geom field?
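For illustration, the workaround boils down to appending a default Point geometry attribute to the type spec and then writing a fixed POINT(0 0) into it for every feature. A minimal sketch of the spec-string part (the helper name is hypothetical; in GeoMesa's spec syntax, `*` marks the default geometry attribute):

```java
public class DummyGeomSketch {
	// Hypothetical helper sketching the workaround: append a default Point
	// geometry attribute to a GeoMesa SimpleFeatureType spec string.
	// "*" marks the default geometry attribute in GeoMesa's spec syntax.
	static String withDummyGeom(String sftSpec) {
		return sftSpec + ",*geom:Point:srid=4326";
	}
}
```

Each ingested feature then gets a fixed POINT(0 0) in that attribute, which is what makes it feel like a hack.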

I use GeoMesa v2.1.2.

Thank you!

Rémi

This message and any attachments (the "message") are intended solely for the addressee(s). It contains confidential information, that may be privileged. If you receive this message in error, please notify the sender immediately and delete the message. Any use of the message in violation of its purpose, any dissemination or disclosure, either wholly or partially is strictly prohibited, unless it has been explicitly authorized by the sender. As its integrity cannot be secured on the internet, Atos and its subsidiaries decline any liability for the content of this message. Although the sender endeavors to maintain a computer virus-free network, the sender does not warrant that this transmission is virus-free and will not be liable for any damages resulting from any virus transmitted.
_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://dev.locationtech.org/mailman/listinfo/geomesa-users
Thank you for your answer. Could you please give me some information on
the packages/files I should look at, as I have no knowledge of the project?

Rémi


Hi Emilio,

Here is the stack trace. You will find attached a piece of code that reproduces the error.

java.lang.NullPointerException
    at org.locationtech.geomesa.spark.GeoMesaRelation.<init>(GeoMesaSparkSQL.scala:240)
    at org.locationtech.geomesa.spark.GeoMesaDataSource.createRelation(GeoMesaSparkSQL.scala:208)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
    at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
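The top frame suggests (an assumption, not verified against the GeoMesa source) that `GeoMesaRelation` dereferences the schema's default geometry descriptor, which GeoTools returns as null for a geometry-less type. The usual fix pattern is a null guard; here is a self-contained sketch using a stand-in for the descriptor lookup:

```java
import java.util.Optional;

class GeomGuardSketch {
	// Stand-in for SimpleFeatureType.getGeometryDescriptor(): GeoTools
	// returns null when the schema has no geometry attribute.
	static String geometryDescriptor(boolean hasGeom) {
		return hasGeom ? "geom" : null;
	}

	// Guarded lookup: yields an empty Optional instead of letting the
	// null propagate into an NPE downstream.
	static Optional<String> defaultGeomField(boolean hasGeom) {
		return Optional.ofNullable(geometryDescriptor(hasGeom));
	}
}
```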


Thank you for your help!

Rémi

void npe() {
	// Local Spark session for the reproduction
	final SparkSession spark = SparkSession.builder().appName("spark NPE").master("local[*]").getOrCreate();

	// GeoMesa SimpleFeatureType spec with no geometry attribute:
	// a date and an indexed long id
	final String sft = "date:Date," +
			"fid:Long:index=true";

	// Mock Accumulo connection parameters
	final java.util.Map<String, String> params = new HashMap<>();
	params.put(AccumuloDataStoreParams.InstanceIdParam().key, "instance");
	params.put(AccumuloDataStoreParams.ZookeepersParam().key, "zoo");
	params.put(AccumuloDataStoreParams.UserParam().key, "user");
	params.put(AccumuloDataStoreParams.PasswordParam().key, "azerty");
	params.put(AccumuloDataStoreParams.CatalogParam().key, "catalog");
	params.put(AccumuloDataStoreParams.AuthsParam().key, "test");
	params.put(AccumuloDataStoreParams.MockParam().key, "true");

	final Map<String, String> scalaParams = MapConverter
			.mutableToImmutable(JavaConverters.mapAsScalaMapConverter(params).asScala());

	final MockAccumuloDatastore accumuloDatastore = new MockAccumuloDatastore(scalaParams);

	// Creating the geometry-less schema works fine
	accumuloDatastore.ds().createSchema(SimpleFeatureTypes.createType("features", sft));

	// Small DataFrame of Beans; the epoch-seconds long is converted
	// into a date column
	Dataset<Row> test = spark.createDataFrame(Arrays.asList(
			new Bean(1),
			new Bean(2),
			new Bean(3)
	), Bean.class).withColumn("date", to_date(to_timestamp(col("date"))));
	test.printSchema();
	test.show();

	// Writing through the "geomesa" Spark data source throws the
	// NullPointerException
	test.write().format("geomesa").options(scalaParams).option("geomesa.feature", "features").save();
}

public class Bean implements Serializable {
	private long fid;
	// Defaults to the current time in epoch seconds
	private long date = new Date().getTime() / 1000;

	public Bean(long id) {
		setFid(id);
	}

	public long getFid() {
		return fid;
	}

	public void setFid(long id) {
		this.fid = id;
	}

	public long getDate() {
		return date;
	}

	public void setDate(long date) {
		this.date = date;
	}
}