Posts under category Google

I have this GAQL query:

    var query = 'SELECT \
        customer.id, \
        customer.descriptive_name, \
        group_placement_view.placement_type, \
        group_placement_view.display_name, \
        metrics.average_cpm \
      FROM group_placement_view \
      WHERE \
        group_placement_view.placement_type IN ("YOUTUBE_CHANNEL") \
        AND campaign.advertising_channel_type = "VIDEO" \
        AND segments.date BETWEEN "' + fromDate.query_date + '" AND "' + toDate.query_date + '" \
        AND metrics.impressions >= 100 \
        AND metrics.average_cpm > 1000000';

IIUC, the segments.date condition means the data is segmented by day. Correct?

But the metrics.impressions >= 100 filter relates to the aggregated data of the whole period. Right?
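To make the question concrete, here is a minimal sketch (using the official Google Ads Java client from Scala; the customer ID and credential setup are placeholders) that puts segments.date into the SELECT list, so every returned row carries its own date. As I understand GAQL, a segment splits the report into per-day rows only when it is selected; a segments.date condition in WHERE on its own just restricts the reporting period.

    import com.google.ads.googleads.lib.GoogleAdsClient
    import com.google.ads.googleads.v10.services.SearchGoogleAdsStreamRequest
    import scala.jdk.CollectionConverters._ // Scala 2.13+; on 2.12 use scala.collection.JavaConverters._

    object GaqlSegmentDemo {
      def main(args: Array[String]): Unit = {
        // segments.date is in SELECT here, so the stream yields one row
        // per placement per day instead of one aggregated row per placement.
        val query =
          """SELECT segments.date, group_placement_view.display_name, metrics.impressions
            |FROM group_placement_view
            |WHERE segments.date BETWEEN '2022-03-01' AND '2022-03-31'
            |  AND metrics.impressions >= 100""".stripMargin

        // Assumes an ads.properties file with developer token and OAuth credentials.
        val client = GoogleAdsClient.newBuilder().fromPropertiesFile().build()
        val service = client.getLatestVersion.createGoogleAdsServiceClient()
        try {
          val request = SearchGoogleAdsStreamRequest.newBuilder()
            .setCustomerId("1234567890") // placeholder customer ID
            .setQuery(query)
            .build()
          for {
            response <- service.searchStreamCallable().call(request).asScala
            row      <- response.getResultsList.asScala
          } println(s"${row.getSegments.getDate}: ${row.getMetrics.getImpressions} impressions")
        } finally service.close()
      }
    }

Printing the per-row dates and impression counts makes it observable whether the metrics.impressions >= 100 threshold was applied to each daily row or to the whole-period total.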

    $requestOptionalArgs = [];
    $requestOptionalArgs['keywordSeed'] = new KeywordSeed(['keywords' => $keywords]);
    $keywordPlanIdeaServiceClient->generateKeywordIdeas([
        'language' => ResourceNames::forLanguageConstant(1000), // English
        'customerId' => $customerId,
        'geoTargetConstants' => $geoTargetConstants,
        'keywordPlanNetwork' => KeywordPlanNetwork::GOOGLE_SEARCH
    ] + $requestOptionalArgs);

The above code works fine as long as the $keywords array has no more than 20 entries. As soon as I add a 21st keyword to $keywords, it throws the error below:

    keyword_plan_idea_error: The input has an invalid value.
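A common workaround sketch, assuming the 20-seed cap observed above is a hard per-request limit (requestIdeas below is a hypothetical stand-in for the generateKeywordIdeas call, which would also carry the language, geo, and network arguments shown in the snippet), is to split the seed list into batches:

    object KeywordIdeaBatches {
      val MaxSeedKeywords = 20 // cap observed in the post

      // Hypothetical stand-in for the real generateKeywordIdeas call.
      def requestIdeas(seedBatch: Seq[String]): Unit =
        println(s"generateKeywordIdeas with ${seedBatch.size} seeds")

      def generateForAll(keywords: Seq[String]): Unit =
        // grouped() yields successive batches of at most MaxSeedKeywords,
        // so no single request exceeds the limit.
        keywords.grouped(MaxSeedKeywords).foreach(requestIdeas)
    }

Since the same idea can come back for more than one batch, the combined results may need de-duplication.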

I was making some changes to a WordPress page (I have access to cPanel). I didn't realize until the next day that this text appeared above the site header:

    gtag('config', 'AW-XXXXXXXXX');

From what I've researched, it's the global site tag for conversion tracking in Google Ads, but I don't know how to hide it or why it appeared.

Please, does anyone know how to hide it and where it is located?

I am trying to migrate from the Google AdWords API to the google-ads-v10 API in Spark 3.1.1 on EMR. I am facing some dependency issues due to conflicts with existing jars. Initially, we faced a dependency problem related to the Protobuf jar:

Exception in thread "grpc-default-executor-0" java.lang.IllegalAccessError: tried to access field com.google.protobuf.AbstractMessage.memoizedSize from class com.google.ads.googleads.v10.services.SearchGoogleAdsRequest     at com.google.ads.googleads.v10.services.SearchGoogleAdsRequest.getSerializedSize(SearchGoogleAdsRequest.java:394)     at io.grpc.protobuf.lite.ProtoInputStream.available(ProtoInputStream.java:108) 

To resolve this, I tried shading the Protobuf jar and building an uber-jar instead. After the shading, my project runs fine locally in IntelliJ, but when I try to run the executable jar I created, I get the following error:

Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException: No functional channel service provider found. Try adding a dependency on the grpc-okhttp, grpc-netty, or grpc-netty-shaded artifact 

I tried adding all those libraries in --spark.jars.packages, but it didn't help. Now I get:

    java.lang.VerifyError: Operand stack overflow
    Exception Details:
      Location:
        io/grpc/internal/TransportTracer.getStats()Lio/grpc/InternalChannelz$TransportStats;
    ... ... ...
        at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.<init>(NettyChannelBuilder.java:96)
        at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.forTarget(NettyChannelBuilder.java:169)
        at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.forAddress(NettyChannelBuilder.java:152)
        at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:38)
        at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:24)
        at io.grpc.ManagedChannelBuilder.forAddress(ManagedChannelBuilder.java:39)
        at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:348)

Has anyone ever encountered such an issue?

Build.sbt

    lazy val dependencies = new {
      val sparkRedshift = "io.github.spark-redshift-community" %% "spark-redshift" % "5.0.3" % "provided" excludeAll (ExclusionRule(organization = "com.amazonaws"))
      val jsonSimple = "com.googlecode.json-simple" % "json-simple" % "1.1" % "provided"
      val googleAdsLib = "com.google.api-ads" % "google-ads" % "17.0.1"
      val jedis = "redis.clients" % "jedis" % "3.0.1" % "provided"
      val sparkAvro = "org.apache.spark" %% "spark-avro" % sparkVersion % "provided"
      val queryBuilder = "com.itfsw" % "QueryBuilder" % "1.0.4" % "provided" excludeAll (ExclusionRule(organization = "com.fasterxml.jackson.core"))
      val protobufForGoogleAds = "com.google.protobuf" % "protobuf-java" % "3.18.1"
      val guavaForGoogleAds = "com.google.guava" % "guava" % "31.1-jre"
    }

    libraryDependencies ++= Seq(
      dependencies.sparkRedshift,
      dependencies.jsonSimple,
      dependencies.googleAdsLib,
      dependencies.guavaForGoogleAds,
      dependencies.protobufForGoogleAds,
      dependencies.jedis,
      dependencies.sparkAvro,
      dependencies.queryBuilder
    )

    dependencyOverrides ++= Set(
      dependencies.guavaForGoogleAds
    )

    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.protobuf.**" -> "repackaged.protobuf.@1").inAll
    )

    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case PathList("module-info.class", xs @ _*) => MergeStrategy.discard
      case x => MergeStrategy.first
    }
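One thing stands out in the merge strategy above (an observation, not a verified fix for this build): case PathList("META-INF", xs @ _*) => MergeStrategy.discard also discards META-INF/services, the ServiceLoader registration files through which gRPC locates grpc-netty / grpc-netty-shaded at runtime, and a missing registration is the classic cause of ManagedChannelProvider$ProviderNotFoundException in uber-jars. A sketch of a merge strategy that keeps those files:

    assemblyMergeStrategy in assembly := {
      // Keep ServiceLoader registrations (e.g. io.grpc.ManagedChannelProvider);
      // concat combines the entries contributed by every gRPC artifact.
      case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case PathList("module-info.class", xs @ _*) => MergeStrategy.discard
      case x => MergeStrategy.first
    }

grpc-netty-shaded registers its channel provider through exactly such a service file, so keeping those entries matters even when every other META-INF file is dropped.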

I am trying to upload user data using a Google Ads manager test account. The API endpoint I am trying to hit is https://googleads.googleapis.com/v10/customers/{customerId}:uploadUserData. However, upon sending the request, I get the following error:

"errors": [                     {                         "errorCodenter code heree": {                             "requestError": "RESOURCE_NAME_MISSING"                         },                         "message": "Resource name is missing."                     }                 ], 

I am giving all the credentials, though, and as far as I can tell my body is correct. I have attached a Postman screenshot below.
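For comparison, a minimal sketch of the call (an assumption rather than a confirmed fix: with uploadUserData, RESOURCE_NAME_MISSING most plausibly refers to the customerMatchUserListMetadata.userList resource name, customers/{customerId}/userLists/{userListId}, which is easy to omit from the body). All IDs and tokens below are placeholders:

    import java.net.URI
    import java.net.http.{HttpClient, HttpRequest, HttpResponse}

    object UploadUserDataSketch {
      def main(args: Array[String]): Unit = {
        val customerId = "1234567890" // placeholder

        // The userList resource name below is the field the error most
        // plausibly refers to (assumption); 1111111111 is a placeholder list ID.
        val body =
          s"""{
             |  "customerMatchUserListMetadata": {
             |    "userList": "customers/$customerId/userLists/1111111111"
             |  },
             |  "operations": [{
             |    "create": {
             |      "userIdentifiers": [{ "hashedEmail": "<sha256-of-normalized-email>" }]
             |    }
             |  }]
             |}""".stripMargin

        val request = HttpRequest.newBuilder()
          .uri(URI.create(s"https://googleads.googleapis.com/v10/customers/$customerId:uploadUserData"))
          .header("Authorization", "Bearer <oauth2-access-token>")
          .header("developer-token", "<developer-token>")
          .header("login-customer-id", "<manager-customer-id>")
          .header("Content-Type", "application/json")
          .POST(HttpRequest.BodyPublishers.ofString(body))
          .build()

        val response = HttpClient.newHttpClient()
          .send(request, HttpResponse.BodyHandlers.ofString())
        println(s"${response.statusCode}: ${response.body}")
      }
    }

If the metadata block is already present in the Postman body, the same error may point at a different required resource name, so it is worth diffing the payload field-by-field against the UploadUserDataRequest reference.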