java.lang.VerifyError: Operand stack overflow for google-ads API and SBT
I am trying to migrate from the Google AdWords API to the google-ads-v10 API in Spark 3.1.1 on EMR. I am facing some dependency issues due to conflicts with existing jars. Initially, we hit a dependency issue related to the Protobuf jar:
Exception in thread "grpc-default-executor-0" java.lang.IllegalAccessError: tried to access field com.google.protobuf.AbstractMessage.memoizedSize from class com.google.ads.googleads.v10.services.SearchGoogleAdsRequest
  at com.google.ads.googleads.v10.services.SearchGoogleAdsRequest.getSerializedSize(SearchGoogleAdsRequest.java:394)
  at io.grpc.protobuf.lite.ProtoInputStream.available(ProtoInputStream.java:108)
To resolve this, I tried shading the Protobuf jar and building an uber-jar instead. After shading, the project runs fine locally in IntelliJ, but when I try to run the executable jar I created, I get the following error:
Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException: No functional channel service provider found. Try adding a dependency on the grpc-okhttp, grpc-netty, or grpc-netty-shaded artifact
I tried adding all of those libraries via spark.jars.packages, but it didn't help.
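For reference, the spark-submit looked roughly like the following (the main class, jar name and grpc version below are placeholders, not my exact values):

  spark-submit \
    --class com.example.AdsJob \
    --conf spark.jars.packages=io.grpc:grpc-netty-shaded:1.44.1,io.grpc:grpc-netty:1.44.1,io.grpc:grpc-okhttp:1.44.1 \
    my-project-assembly.jar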
java.lang.VerifyError: Operand stack overflow
Exception Details:
  Location:
    io/grpc/internal/TransportTracer.getStats()Lio/grpc/InternalChannelz$TransportStats;
  ...
  at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.<init>(NettyChannelBuilder.java:96)
  at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.forTarget(NettyChannelBuilder.java:169)
  at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.forAddress(NettyChannelBuilder.java:152)
  at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:38)
  at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:24)
  at io.grpc.ManagedChannelBuilder.forAddress(ManagedChannelBuilder.java:39)
  at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:348)
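The call that ends up in InstantiatingGrpcChannelProvider is nothing special; it is just the standard client construction from the google-ads library. A minimal sketch of that call shape (the customer ID and GAQL query are dummy placeholders, not my actual code):

  import com.google.ads.googleads.lib.GoogleAdsClient

  // Build the client from ads.properties; the gRPC channel that fails above is
  // created when the service client is constructed / the first request is sent.
  val googleAdsClient = GoogleAdsClient.newBuilder()
    .fromPropertiesFile()
    .build()

  val serviceClient = googleAdsClient.getLatestVersion.createGoogleAdsServiceClient()

  // Dummy customer ID and query, only to show the call shape.
  val response = serviceClient.search("1234567890", "SELECT campaign.id FROM campaign")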
Has anyone ever encountered such an issue?
build.sbt:
lazy val dependencies = new {
  val sparkRedshift = "io.github.spark-redshift-community" %% "spark-redshift" % "5.0.3" % "provided" excludeAll (ExclusionRule(organization = "com.amazonaws"))
  val jsonSimple = "com.googlecode.json-simple" % "json-simple" % "1.1" % "provided"
  val googleAdsLib = "com.google.api-ads" % "google-ads" % "17.0.1"
  val jedis = "redis.clients" % "jedis" % "3.0.1" % "provided"
  val sparkAvro = "org.apache.spark" %% "spark-avro" % sparkVersion % "provided"
  val queryBuilder = "com.itfsw" % "QueryBuilder" % "1.0.4" % "provided" excludeAll (ExclusionRule(organization = "com.fasterxml.jackson.core"))
  val protobufForGoogleAds = "com.google.protobuf" % "protobuf-java" % "3.18.1"
  val guavaForGoogleAds = "com.google.guava" % "guava" % "31.1-jre"
}

libraryDependencies ++= Seq(
  dependencies.sparkRedshift,
  dependencies.jsonSimple,
  dependencies.googleAdsLib,
  dependencies.guavaForGoogleAds,
  dependencies.protobufForGoogleAds,
  dependencies.jedis,
  dependencies.sparkAvro,
  dependencies.queryBuilder
)

dependencyOverrides ++= Set(
  dependencies.guavaForGoogleAds
)

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "repackaged.protobuf.@1").inAll
)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case PathList("module-info.class", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
I had a similar issue and I changed the assembly merge strategy to this:
assemblyMergeStrategy in assembly := {
  case x if x.contains("io.netty.versions.properties") => MergeStrategy.discard
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

Solved this by using the google-ads-shadowjar as an external jar rather than having a dependency on the google-ads library. This avoids having to deal with its dependencies manually, but it does make your jar size bigger.
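A sketch of one way to wire this up (the directory, file name and version below are assumptions, not the exact setup described above): remove googleAdsLib, protobufForGoogleAds, guavaForGoogleAds and the protobuf ShadeRule from build.sbt, download the pre-shaded google-ads-shadowjar bundle, and reference it as an unmanaged jar, for example:

  // build.sbt: pick up the downloaded shadow jar as an unmanaged dependency
  // (placeholder path and version; match them to the google-ads release you need)
  unmanagedJars in Compile += Attributed.blank(baseDirectory.value / "external" / "google-ads-shadowjar-17.0.1.jar")

The idea is that the shadow jar already bundles its own copies of dependencies such as protobuf and guava, so they no longer have to be reconciled with what Spark ships on EMR; the trade-off is that the whole bundle ends up inside your assembly, which is why the final jar grows.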