
Auto-instrumentation: Incoming and outgoing HTTP requests create separate, uncorrelated traces #1200

Open
shravanv1 opened this issue Oct 15, 2024 · 11 comments
Labels
bug Something isn't working

Comments

@shravanv1

shravanv1 commented Oct 15, 2024

Describe the bug

I am using OpenTelemetry auto-instrumentation in a Go application to trace both incoming and outgoing HTTP requests. However, while I can see traces for incoming requests, the outgoing requests are creating entirely new, uncorrelated traces instead of being part of the same trace as the incoming request.

Environment
Language: Go
Go Version: 1.23.1
OpenTelemetry Go Auto-Instrumentation Version: v0.15.0-alpha
OS: Linux (Amazon Linux 2023)
Kernel Version: 6.1.0
Tracing Backend: Jaeger

Code:

// this is handler for API -- /v1/thor/rest/flight/home/deals
func DailyStealDealHandlerChain() gin.HandlersChain {
	return gin.HandlersChain{
		baseRequestParserMiddleware(client.NewDealsBaseHelper),
		monitorHelper(handlers.NewMetricNameHelper(internal.MetricPrefixStealDeal)),
		authMiddleware(),
		requestParser(client.DailyStealDealRequestParser),
		pokusMiddleware(),
		getController(handlers.DailyStealDealHandler),
	}
}


// helper to create gin handlers from common handlers
func getController(handler func(icontext.IContext, http.ResponseWriter, *http.Request)) gin.HandlerFunc {
	return func(c *gin.Context) {
		iCtx := copyGinContext(c)
		generic_logger.NewLoggerWithUpdatedCtx(iCtx)
		handler(iCtx, c.Writer, c.Request)
		c.Writer.Flush()
	}
}

func copyGinContext(c *gin.Context) icontext.IContext {
	ctx := icontext.NewContext()
	for k, v := range c.Keys {
		ctx.SetValue(icontext.ContextKey(k), v)
	}
	for _, p := range c.Params {
		ctx.SetValue(icontext.ContextKey(p.Key), p.Value)
	}
	return ctx
}


func DailyStealDealHandler(ctx icontext.IContext, w http.ResponseWriter, req *http.Request) {
	dealReq := client.GetDailyStealDealRequest(ctx.GetBaseContext())
	var clientResp *client.DealsResponse
	var err error

	clientResp, err = dealHandlerV2(ctx, dealReq)

	if err != nil {
		client.SendError(w, err)
	}
	client.SendJsonRespnse(w, clientResp)
}

Inside dealHandlerV2 I am making an HTTP call to another API, flights-b2c-misc/v2/daily-steal-deals.*, by creating the request like this:

ctx, _ = context.WithTimeout(ctx, time.Duration(b.Timeout)*time.Millisecond)
req, err := http.NewRequestWithContext(ctx, b.Method, b.Url, b.Body)

But for /v1/thor/rest/flight/home/deals I am getting one trace and for flights-b2c-misc/v2/daily-steal-deals.* I am getting a different trace; ideally these should be the same trace.

@shravanv1 shravanv1 added the bug Something isn't working label Oct 15, 2024
@shravanv1
Author

shravanv1 commented Oct 16, 2024

Hello, can anyone please provide an update on this? I have tried multiple ways but am still facing the same issue. @RonFed

@damemi
Contributor

damemi commented Oct 16, 2024

Hi, can you please share the logs from the auto-instrumentation agent?

@shravanv1
Author

@damemi please find the attached logs:

{
  "time": "2024-10-16T21:42:16.987830371+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 75
  },
  "msg": "getting tracer",
  "name": "Flights-GI-Backend-ts",
  "version": "",
  "schema": ""
}

{
  "time": "2024-10-16T21:42:16.994011898+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 56
  },
  "msg": "got event",
  "kind": "client",
  "pkg": "net/http",
  "attrs": [
    {
      "Key": "http.request.method",
      "Value": {
        "Type": "STRING",
        "Value": "POST"
      }
    },
    {
      "Key": "http.response.status_code",
      "Value": {
        "Type": "INT64",
        "Value": 200
      }
    },
    {
      "Key": "url.path",
      "Value": {
        "Type": "STRING",
        "Value": "/experiment/assignVariantBackend"
      }
    },
    {
      "Key": "url.full",
      "Value": {
        "Type": "STRING",
        "Value": "http://pokus-gi.mmt.mmt/experiment/assignVariantBackend"
      }
    },
    {
      "Key": "server.address",
      "Value": {
        "Type": "STRING",
        "Value": "pokus-gi.mmt.mmt"
      }
    },
    {
      "Key": "network.protocol.version",
      "Value": {
        "Type": "STRING",
        "Value": "1.1"
      }
    }
  ],
  "traceID": "999dee807db1ec854a11c5460d677d46",
  "spanID": "566a6db5dbc42071"
}

{
  "time": "2024-10-16T21:42:16.994057519+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 75
  },
  "msg": "getting tracer",
  "name": "",
  "version": "",
  "schema": "https://opentelemetry.io/schemas/1.26.0"
}
{
  "time": "2024-10-16T21:42:16.994412896+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 75
  },
  "msg": "getting tracer",
  "name": "Flights-GI-Backend-ts",
  "version": "",
  "schema": ""
}

{
  "time": "2024-10-16T21:42:17.012567507+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 56
  },
  "msg": "got event",
  "kind": "client",
  "pkg": "net/http",
  "attrs": [
    {
      "Key": "http.request.method",
      "Value": {
        "Type": "STRING",
        "Value": "GET"
      }
    },
    {
      "Key": "http.response.status_code",
      "Value": {
        "Type": "INT64",
        "Value": 200
      }
    },
    {
      "Key": "url.path",
      "Value": {
        "Type": "STRING",
        "Value": "/flights-b2c-misc/v2/daily-steal-deals"
      }
    },
    {
      "Key": "url.full",
      "Value": {
        "Type": "STRING",
        "Value": "http://flights-proxy-service.ecs.mmt/flights-b2c-misc/v2/daily-steal-deals?city=DEL%2CBOM%2CGOI%2CHYD%2CMAA%2CBLR%2CCCU%2CSXR%2CTRV%2CAMD\u0026currency=inr\u0026language=eng\u0026region=in"
      }
    },
    {
      "Key": "server.address",
      "Value": {
        "Type": "STRING",
        "Value": "flights-proxy-service.ecs.mmt"
      }
    },
    {
      "Key": "network.protocol.version",
      "Value": {
        "Type": "STRING",
        "Value": "1.1"
      }
    }
  ],
  "traceID": "fc5c0c088229b958b83394fdc68abd5c",
  "spanID": "2e064b5142cce655"
}

{
  "time": "2024-10-16T21:42:17.01267172+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 75
  },
  "msg": "getting tracer",
  "name": "",
  "version": "",
  "schema": "https://opentelemetry.io/schemas/1.26.0"
}

{
  "time": "2024-10-16T21:42:17.014058278+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 56
  },
  "msg": "got event",
  "kind": "server",
  "pkg": "net/http",
  "attrs": [
    {
      "Key": "http.request.method",
      "Value": {
        "Type": "STRING",
        "Value": "POST"
      }
    },
    {
      "Key": "url.path",
      "Value": {
        "Type": "STRING",
        "Value": "/v1/thor/rest/flight/home/deals"
      }
    },
    {
      "Key": "http.response.status_code",
      "Value": {
        "Type": "INT64",
        "Value": 200
      }
    },
    {
      "Key": "network.peer.address",
      "Value": {
        "Type": "STRING",
        "Value": "127.0.0.1"
      }
    },
    {
      "Key": "network.peer.port",
      "Value": {
        "Type": "INT64",
        "Value": 49778
      }
    },
    {
      "Key": "server.address",
      "Value": {
        "Type": "STRING",
        "Value": "gicb"
      }
    },
    {
      "Key": "network.protocol.version",
      "Value": {
        "Type": "STRING",
        "Value": "1.1"
      }
    }
  ],
  "traceID": "fc57d028e13d5cf01fcc68224be25f75",
  "spanID": "298504b20bd9459a"
}

{
  "time": "2024-10-16T21:42:17.014099379+05:30",
  "level": "DEBUG",
  "source": {
    "function": "go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace",
    "file": "/go/src/Flights-GI-Backend/opentelemetry-go-instrumentation/internal/pkg/opentelemetry/controller.go",
    "line": 75
  },
  "msg": "getting tracer",
  "name": "",
  "version": "",
  "schema": "https://opentelemetry.io/schemas/1.26.0"
}

@damemi
Contributor

damemi commented Oct 16, 2024

@shravanv1 do you have the full logs from the agent from when it starts up? Not just the logs around the spans. I'm looking for a message that would indicate if your host supports context propagation

@shravanv1
Author

shravanv1 commented Oct 16, 2024

@damemi the thing is, when I create the icontext with ctx := icontext.NewContext() in func copyGinContext(c *gin.Context) icontext.IContext, I think that is where the propagation breaks. When I use ctx := icontext.NewContextFromParent(c.Request.Context()) instead, context propagation works fine. But I can't use NewContextFromParent because I use the icontext later for async logging, and once the request completes that context gets cancelled.
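
One possible middle ground (just a sketch; it assumes icontext.NewContextFromParent accepts any context.Context and that a context derived this way still counts as a descendant for the instrumentation): Go 1.21+ has context.WithoutCancel, which derives a context that keeps the parent's values but is not cancelled when the request finishes, so it can still back async logging.

// Sketch only: hypothetical variant of copyGinContext using context.WithoutCancel (Go 1.21+).
// Requires the "context" import in addition to the existing ones.
func copyGinContext(c *gin.Context) icontext.IContext {
	// Derived from the incoming request's context, but detached from its cancellation,
	// so async logging after the response is written is not affected.
	ctx := icontext.NewContextFromParent(context.WithoutCancel(c.Request.Context()))
	for k, v := range c.Keys {
		ctx.SetValue(icontext.ContextKey(k), v)
	}
	for _, p := range c.Params {
		ctx.SetValue(icontext.ContextKey(p.Key), p.Value)
	}
	return ctx
}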

Adding @MrAlias also. This is a bit urgent as I am planning to go live with auto-instrumentation in prod.

PFA start up logs
otel.json

@RonFed
Contributor

RonFed commented Oct 19, 2024

@shravanv1 you pointed out the problem correctly. Currently, for in-process context propagation to work, the context of the outgoing request must be a descendant of the one from the incoming request.
ctx := icontext.NewContext() breaks that.
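
To make the requirement concrete, here is a minimal sketch with plain net/http (the handler name and URL are only illustrative): the context handed to the outgoing request is derived from the incoming request's context rather than from a fresh one.

// Minimal sketch: the outgoing request's context descends from the incoming one.
import (
	"context"
	"net/http"
	"time"
)

func dealsHandler(w http.ResponseWriter, r *http.Request) {
	// Derive the outgoing call's context from the incoming request's context,
	// not from context.Background() or a freshly constructed context.
	ctx, cancel := context.WithTimeout(r.Context(), 500*time.Millisecond)
	defer cancel()

	req, err := http.NewRequestWithContext(ctx, http.MethodGet,
		"http://flights-proxy-service.ecs.mmt/flights-b2c-misc/v2/daily-steal-deals", nil)
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()
	w.WriteHeader(resp.StatusCode)
}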

@ahawtho

ahawtho commented Nov 19, 2024

Environment
Language: Go
Go Version: 1.23.1
OpenTelemetry Go Auto-Instrumentation Version: v0.17.0-alpha
OS: Linux (Amazon Linux 2023)
Kernel Version: 6.1.112-124.190.amzn2023.x86_64
Tracing Backend: OTLP

Relevant Instrumentation values:

spec:
  exporter:
    endpoint: http://datadog.datadog.svc.cluster.local:4318
  go:
    env:
      - name: OTEL_SERVICE_NAME
        value: empty
      - name: OTEL_EXPORTER_OTLP_PROTOCOL
        value: grpc
      - name: OTEL_LOG_LEVEL
        value: debug
      - name: OTEL_GO_AUTO_GLOBAL
        value: 'true'
      - name: OTEL_EXPORTER_OTLP_ENDPOINT
        value: http://datadog.datadog.svc.cluster.local:4318
      - name: OTEL_PROPAGATORS
        value: tracecontext,baggage
  propagators:
    - tracecontext
    - baggage
  sampler:
    argument: '1'
    type: parentbased_always_on

I'm seeing a similar issue as reported here, but the service we're testing on is a simple "empty" service using our conventions. We use this service for testing and doing exploration like this. Our services generally use gin for routing.

I added two endpoints to this service, /api/v1/testing/source and /api/v1/testing/dest. External requests are made to .../source, which invokes .../dest. The response from .../source includes the relevant headers seen in both requests.

The service runs in a cluster with Istio, which has datadog tracing enabled. The version we run (1.20) doesn't yet support otel natively; however, the datadog support appears to include the traceparent/tracestate headers from the tracecontext spec. I ran a test without auto-instrumentation first. I added code to manually propagate only the tracecontext headers, and in the .../dest endpoint I still saw traceIds and spanIds that were consistent between the tracecontext and datadog headers.

Then we enabled auto-instrumentation using the otel operator. Whether or not we manually propagate the tracecontext headers, the headers in .../dest no longer have the same traceId as the ones in .../source. I can see that the instrumentation is being applied successfully, but there are a few surprises.

  1. From the probe source, I can see there is an effort to extract the state from the headers, but when I manually use propagation.TraceContext{}.Inject from the request.Context() into a MapCarrier, there doesn't seem to be anything propagated.
  2. When I manually extract headers using propagation.TraceContext{}.Extract(request.Context(), propagation.HeaderCarrier(request.Header)) and then Inject() back to a MapCarrier, I can see the headers, so I know that they are really there (see the sketch after this list).
  3. In the debug output from the opentelemetry-auto-instrumentation sidecar, I don't see the same contents as shown in @shravanv1's output; the Tracer output seems to be empty/null:
{"time":"2024-11-19T15:40:33.082592037Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/opentelemetry.(*Controller).Trace","file":"/app/internal/pkg/opentelemetry/controller.go","line":54},"msg":"handling span","tracer":{"Tracer":null},"span":{}}

Here's the details of the instrumentation:

{"time":"2024-11-19T15:40:30.068927041Z","level":"INFO","source":{"function":"main.main","file":"/app/cli/main.go","line":103},"msg":"building OpenTelemetry Go instrumentation ...","globalImpl":false,"version":{"Release":"v0.17.0-alpha","Revision":"unknown","Go":{"Version":"go1.23.2","OS":"linux","Arch":"amd64"}}}
{"time":"2024-11-19T15:40:32.069864085Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).DiscoverProcessID","file":"/app/internal/pkg/process/discover.go","line":64},"msg":"found process","pid":68}
{"time":"2024-11-19T15:40:32.079370322Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"net/http.Header.writeSubset","Offset":3813664,"ReturnOffsets":[3814151,3814963]}}
{"time":"2024-11-19T15:40:32.079493065Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"net/http.(*Transport).roundTrip","Offset":3960896,"ReturnOffsets":[3961456,3961674,3962183,3962450,3962579,3963129,3963174,3963292,3963410,3964331,3964464,3964599,3964723,3964802,3965035,3965162]}}
{"time":"2024-11-19T15:40:32.079501236Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"net/http.serverHandler.ServeHTTP","Offset":4027424,"ReturnOffsets":[4027571]}}
{"time":"2024-11-19T15:40:32.079504575Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"go.opentelemetry.io/otel/internal/global.(*tracer).Start","Offset":4264288,"ReturnOffsets":[4264769,4264970]}}
{"time":"2024-11-19T15:40:32.079507755Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"go.opentelemetry.io/otel/internal/global.(*nonRecordingSpan).End","Offset":4266912,"ReturnOffsets":[4266931]}}
{"time":"2024-11-19T15:40:32.079510688Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"go.opentelemetry.io/otel/internal/global.(*nonRecordingSpan).SetAttributes","Offset":4267104,"ReturnOffsets":[4267123]}}
{"time":"2024-11-19T15:40:32.079513615Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"go.opentelemetry.io/otel/internal/global.(*nonRecordingSpan).SetName","Offset":4267232,"ReturnOffsets":[4267251]}}
{"time":"2024-11-19T15:40:32.079516291Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"go.opentelemetry.io/otel/internal/global.(*nonRecordingSpan).SetStatus","Offset":4267296,"ReturnOffsets":[4267315]}}
{"time":"2024-11-19T15:40:32.079518847Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"database/sql.(*DB).execDC","Offset":6543168,"ReturnOffsets":[6544017,6544314,6544384,6544549,6544871]}}
{"time":"2024-11-19T15:40:32.079534764Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"database/sql.(*DB).queryDC","Offset":6546848,"ReturnOffsets":[6547947,6548016,6548064,6548256,6548304]}}
{"time":"2024-11-19T15:40:32.07953752Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"google.golang.org/grpc/internal/transport.(*loopyWriter).headerHandler","Offset":7375040,"ReturnOffsets":[7375337,7375376,7375402,7375436,7375568,7375709]}}
{"time":"2024-11-19T15:40:32.079540635Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"google.golang.org/grpc/internal/transport.(*http2Client).NewStream","Offset":7429824,"ReturnOffsets":[7430318,7431383,7431460,7431571,7431676,7432145,7432262]}}
{"time":"2024-11-19T15:40:32.079545113Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"google.golang.org/grpc/internal/transport.(*http2Server).operateHeaders","Offset":7467648,"ReturnOffsets":[7468557,7468684,7473104,7474336,7474538,7475228,7476081,7477306,7477626,7477754,7478478,7478525]}}
{"time":"2024-11-19T15:40:32.079548418Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"google.golang.org/grpc/internal/transport.(*http2Server).WriteStatus","Offset":7491520,"ReturnOffsets":[7494020,7494127,7494186,7494238,7494619,7494870]}}
{"time":"2024-11-19T15:40:32.079551173Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"google.golang.org/grpc.(*ClientConn).Invoke","Offset":7649472,"ReturnOffsets":[7649997,7650055]}}
{"time":"2024-11-19T15:40:32.079554291Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze","file":"/app/internal/pkg/process/analyze.go","line":95},"msg":"found function","function_name":{"Name":"google.golang.org/grpc.(*Server).handleStream","Offset":7747712,"ReturnOffsets":[7749263,7751216,7751281,7751362,7751443]}}
{"time":"2024-11-19T15:40:32.079679733Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.Allocate","file":"/app/internal/pkg/process/allocate.go","line":37},"msg":"Requesting memory allocation","size":65536,"page size":4096,"cpu count":2}
{"time":"2024-11-19T15:40:32.081837911Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.NewTracedProgram","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":125},"msg":"attach successfully","tid":68}
{"time":"2024-11-19T15:40:32.082051934Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.NewTracedProgram","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":125},"msg":"attach successfully","tid":77}
{"time":"2024-11-19T15:40:32.082361479Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.NewTracedProgram","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":125},"msg":"attach successfully","tid":78}
{"time":"2024-11-19T15:40:32.082436066Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.NewTracedProgram","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":125},"msg":"attach successfully","tid":79}
{"time":"2024-11-19T15:40:32.082591952Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.NewTracedProgram","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":125},"msg":"attach successfully","tid":80}
{"time":"2024-11-19T15:40:32.082874439Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.NewTracedProgram","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":125},"msg":"attach successfully","tid":81}
{"time":"2024-11-19T15:40:32.083165752Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.remoteAllocate","file":"/app/internal/pkg/process/allocate.go","line":80},"msg":"Set memlock on process successfully"}
{"time":"2024-11-19T15:40:32.083519797Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.(*TracedProgram).Madvise","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":237},"msg":"Detected linux kernel version","version":"6.1.0"}
{"time":"2024-11-19T15:40:32.087963876Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process/ptrace.(*TracedProgram).Mlock","file":"/app/internal/pkg/process/ptrace/ptrace_linux.go","line":249},"msg":"mlock ret","ret":0}
{"time":"2024-11-19T15:40:32.088003231Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.remoteAllocate.func1","file":"/app/internal/pkg/process/allocate.go","line":70},"msg":"Detaching from process","pid":68}
{"time":"2024-11-19T15:40:32.088049923Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/process.Allocate","file":"/app/internal/pkg/process/allocate.go","line":48},"msg":"mmaped remote memory","start_addr":"0x7f548f21a000","end_addr":"0x7f548f22a000"}
{"time":"2024-11-19T15:40:32.088069199Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto.NewInstrumentation","file":"/app/instrumentation.go","line":113},"msg":"target process analysis completed","pid":68,"go_version":"1.23.1","dependencies":{"filippo.io/edwards25519":"1.1.0",
[... internal packages elided...]
,"github.com/aws/aws-sdk-go":"1.55.5","github.com/beorn7/perks":"1.0.1","github.com/cespare/xxhash/v2":"2.3.0","github.com/creasty/defaults":"1.8.0","github.com/fatih/structtag":"1.2.0","github.com/fsnotify/fsnotify":"1.8.0","github.com/gabriel-vasile/mimetype":"1.4.6","github.com/gin-contrib/sse":"0.1.0","github.com/gin-gonic/gin":"1.10.0","github.com/go-logr/logr":"1.4.2","github.com/go-logr/stdr":"1.2.2","github.com/go-playground/locales":"0.14.1","github.com/go-playground/universal-translator":"0.18.1","github.com/go-playground/validator/v10":"10.22.1","github.com/go-sql-driver/mysql":"1.8.1","github.com/gofrs/uuid/v5":"5.3.0","github.com/golang-jwt/jwt/v5":"5.2.1","github.com/hashicorp/hcl":"1.0.0","github.com/jinzhu/gorm":"1.9.16","github.com/jinzhu/inflection":"1.0.0","github.com/jmespath/go-jmespath":"0.4.0","github.com/klauspost/compress":"1.17.11","github.com/leodido/go-urn":"1.4.0","github.com/lib/pq":"1.10.9","github.com/magiconair/properties":"1.8.7","github.com/mattn/go-isatty":"0.0.20","github.com/mitchellh/mapstructure":"1.5.0","github.com/munnerz/goautoneg":"0.0.0-20191010083416-a7dc8b61c822","github.com/newrelic/go-agent/v3":"3.35.1","github.com/newrelic/go-agent/v3/integrations/nrawssdk-v1":"1.1.2","github.com/newrelic/go-agent/v3/integrations/nrgin":"1.3.2","github.com/newrelic/go-agent/v3/integrations/nrlogrus":"1.0.1","github.com/pelletier/go-toml/v2":"2.2.3","github.com/pkg/errors":"0.9.1","github.com/prometheus/client_golang":"1.20.5","github.com/prometheus/client_model":"0.6.1","github.com/prometheus/common":"0.60.1","github.com/prometheus/procfs":"0.15.1","github.com/rollbar/rollbar-go":"1.4.5","github.com/sirupsen/logrus":"1.9.3","github.com/spf13/afero":"1.11.0","github.com/spf13/cast":"1.7.0","github.com/spf13/cobra":"1.8.1","github.com/spf13/jwalterweatherman":"1.1.0","github.com/spf13/pflag":"1.0.5","github.com/spf13/viper":"1.16.0","github.com/subosito/gotenv":"1.6.0","github.com/ugorji/go/codec":"1.2.12","go.opentelemetry.io/otel":"1.31.0","go.opentelemetry.io/otel/metric":"1.31.0","go.opentelemetry.io/otel/trace":"1.31.0","go.uber.org/multierr":"1.11.0","go.uber.org/zap":"1.27.0","golang.org/x/crypto":"0.28.0","golang.org/x/net":"0.30.0","golang.org/x/sys":"0.27.0","golang.org/x/text":"0.19.0","google.golang.org/genproto/googleapis/rpc":"0.0.0-20241007155032-5fefd90f89a9","google.golang.org/grpc":"1.67.0","google.golang.org/protobuf":"1.35.2","gopkg.in/ini.v1":"1.67.0","gopkg.in/yaml.v3":"3.0.1","std":"1.23.1"},"total_functions_found":16}
{"time":"2024-11-19T15:40:32.088579313Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).FilterUnusedProbes","file":"/app/internal/pkg/instrumentation/manager.go","line":152},"msg":"no functions found for probe, removing","name":{"SpanKind":4,"InstrumentedPkg":"github.com/segmentio/kafka-go"}}
{"time":"2024-11-19T15:40:32.088654515Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).FilterUnusedProbes","file":"/app/internal/pkg/instrumentation/manager.go","line":152},"msg":"no functions found for probe, removing","name":{"SpanKind":5,"InstrumentedPkg":"github.com/segmentio/kafka-go"}}
{"time":"2024-11-19T15:40:32.08866905Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).FilterUnusedProbes","file":"/app/internal/pkg/instrumentation/manager.go","line":152},"msg":"no functions found for probe, removing","name":{"SpanKind":3,"InstrumentedPkg":"go.opentelemetry.io/auto"}}
{"time":"2024-11-19T15:40:32.088702684Z","level":"INFO","source":{"function":"main.main","file":"/app/cli/main.go","line":134},"msg":"starting instrumentation..."}
{"time":"2024-11-19T15:40:32.090725281Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).mount","file":"/app/internal/pkg/instrumentation/manager.go","line":347},"msg":"Mounting bpffs","allocations_details":{"StartAddr":140001155325952,"EndAddr":140001155391488,"NumCPU":2}}
{"time":"2024-11-19T15:40:32.091024754Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load","file":"/app/internal/pkg/instrumentation/manager.go","line":332},"msg":"loading probe","name":{"SpanKind":3,"InstrumentedPkg":"go.opentelemetry.io/otel/internal/global"}}
{"time":"2024-11-19T15:40:32.439601415Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load","file":"/app/internal/pkg/instrumentation/manager.go","line":332},"msg":"loading probe","name":{"SpanKind":2,"InstrumentedPkg":"google.golang.org/grpc"}}
{"time":"2024-11-19T15:40:32.499005811Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load","file":"/app/internal/pkg/instrumentation/manager.go","line":332},"msg":"loading probe","name":{"SpanKind":2,"InstrumentedPkg":"net/http"}}
{"time":"2024-11-19T15:40:32.52553641Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load","file":"/app/internal/pkg/instrumentation/manager.go","line":332},"msg":"loading probe","name":{"SpanKind":3,"InstrumentedPkg":"database/sql"}}
{"time":"2024-11-19T15:40:32.579063529Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load","file":"/app/internal/pkg/instrumentation/manager.go","line":332},"msg":"loading probe","name":{"SpanKind":3,"InstrumentedPkg":"google.golang.org/grpc"}}
{"time":"2024-11-19T15:40:32.600712097Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load","file":"/app/internal/pkg/instrumentation/manager.go","line":332},"msg":"loading probe","name":{"SpanKind":3,"InstrumentedPkg":"net/http"}}
{"time":"2024-11-19T15:40:32.606212978Z","level":"DEBUG","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load","file":"/app/internal/pkg/instrumentation/manager.go","line":341},"msg":"loaded probes to memory","total_probes":6}
{"time":"2024-11-19T15:40:32.610532742Z","level":"INFO","source":{"function":"main.main.func3","file":"/app/cli/main.go","line":130},"msg":"instrumentation loaded successfully"}
{"time":"2024-11-19T15:40:32.610740425Z","level":"INFO","source":{"function":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).ConfigLoop","file":"/app/internal/pkg/instrumentation/manager.go","line":241},"msg":"Configuration provider closed, configuration updates will no longer be received"}

@shravanv1
Author

@RonFed can you please suggest why I am getting data like this? Any idea? Also, I have fixed that context propagation issue.
Flights-Cbackend and Flights-go-mmt-svcs are Go services; Flights-b2c-search is a Java service.
Screenshot 2024-11-21 at 4 14 18 PM

@RonFed
Contributor

RonFed commented Nov 21, 2024

@shravanv1 If I understood the image correctly, the context is propagated from Go to Java and from Java to Go (since they all have the same trace ID). However, Flights-b2c-search, which is Java, does not fill in any parent span; if that is the case, maybe the problem is in the Java instrumentation?

@RonFed
Contributor

RonFed commented Nov 21, 2024

@ahawtho Do you know if the Go context object is being propagated from the source to the dest endpoint? I.e., the request.Context() for the outgoing request must be a descendant of the one used for the incoming request for the span context to propagate correctly.

@shravanv1
Author

@RonFed no, I suspect something is wrong here; maybe some of my spans got dropped or they are not being generated properly.

As you can see in the attached pictures, there is one API, 'dpt', that is being called from go-mmt-svcs; in one case I am getting two traces, and in the other I am not getting one trace.
Screenshot 2024-11-22 at 5 28 20 PM
Screenshot 2024-11-22 at 5 28 42 PM
