
mysql2 seems to put the first query in its own span when using a pool #1050

Open
mbrevda opened this issue Jun 7, 2022 · 8 comments
Assignees: dyladan
Labels: bug (Something isn't working) · pkg:instrumentation-mysql2 · priority:p2 (Bugs and spec inconsistencies which cause telemetry to be incomplete or incorrect)


mbrevda commented Jun 7, 2022

What version of OpenTelemetry are you using?

"@opentelemetry/instrumentation-dns": "0.28.0",
"@opentelemetry/instrumentation-express": "0.29.0",
"@opentelemetry/instrumentation-fs": "0.3.0",
"@opentelemetry/instrumentation-grpc": "0.29.2",
"@opentelemetry/instrumentation-http": "0.29.2",
"@opentelemetry/instrumentation-mysql2": "0.30.0",

What version of Node are you using?

v16.12.0

What did you do?

My Config
import opentelemetry from '@opentelemetry/sdk-node'
import {diag, DiagConsoleLogger, DiagLogLevel} from '@opentelemetry/api'
import {Resource} from '@opentelemetry/resources'
import {SemanticResourceAttributes} from '@opentelemetry/semantic-conventions'
import {DnsInstrumentation} from '@opentelemetry/instrumentation-dns'
import {MySQL2Instrumentation} from '@opentelemetry/instrumentation-mysql2'
import {ExpressInstrumentation} from '@opentelemetry/instrumentation-express'
import {HttpInstrumentation} from '@opentelemetry/instrumentation-http'
import {GrpcInstrumentation} from '@opentelemetry/instrumentation-grpc'
import {WinstonInstrumentation} from '@opentelemetry/instrumentation-winston'
// import {FsInstrumentation} from '@opentelemetry/instrumentation-fs'
import {TraceExporter} from '@google-cloud/opentelemetry-cloud-trace-exporter'

// For troubleshooting, set the log level to DiagLogLevel.DEBUG
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.INFO)

const traceExporter = process.env.K_CONFIGURATION
  ? new TraceExporter()
  : new opentelemetry.tracing.ConsoleSpanExporter()

const sdk = new opentelemetry.NodeSDK({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: 'segops-api'
  }),
  traceExporter,
  instrumentations: [
    // new FsInstrumentation({createHook: console.log}),
    new DnsInstrumentation(),
    new MySQL2Instrumentation(),
    new ExpressInstrumentation(),
    new GrpcInstrumentation(),
    new HttpInstrumentation(),
    new WinstonInstrumentation()
  ]
})

sdk.start()

process.on('SIGTERM', () => {
  sdk.shutdown().finally(() => process.exit(0))
})
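The config above lives in its own file; roughly, it is loaded like this (the file names and entry point are illustrative, not the actual project layout):

// index.js (illustrative entry point)
import './tracing.js' // the NodeSDK setup above; evaluated before the app code below
import './server.js'  // express routes, mysql2 pool, etc.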

What did you expect to see?

I expected all spans from a single request to share the same traceId.

What did you see instead?

I noticed that the span for the first MySQL query gets its own traceId. FWIW, we're using pools.
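Roughly how the queries are issued (a simplified sketch, not the actual code; the route, pool options, and statement are illustrative):

import express from 'express'
import mysql from 'mysql2/promise'

// Pool settings mirror the attributes in the span dumps below (localhost,
// root, myDb); everything else here is illustrative.
const pool = mysql.createPool({host: 'localhost', user: 'root', database: 'myDb'})

const app = express()

app.get('/v1/foo', async (req, res) => {
  // On the very first request the pool still has to open its connection;
  // that first SELECT span comes out with its own traceId instead of being
  // parented to the incoming HTTP GET span. Subsequent requests look fine.
  const [rows] = await pool.query('SELECT * FROM ...') // statement elided, as in the dumps below
  res.json(rows)
})

app.listen(3000)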

Additional context

First call
debug /v1/foo  {     // <--- from logger
  "timestamp": "2022-06-07T12:33:42.320Z",
  "trace_id": "7f21606ebea9e9def4ebc66a35a76eff", // <--- ✅
  "span_id": "66e861f32ea7b8ee",
  "trace_flags": "01"
}
{
  traceId: '50177912b0a568f6c90e685395e78bd8',  // <--- 🤔
  parentId: undefined,
  name: 'SELECT',
  id: '42e1e1079bc997d9',
  kind: 2,
  timestamp: 1654605222339837,
  duration: 2799,
  attributes: {
    'db.system': 'mysql',
    'net.peer.name': 'localhost',
    'net.peer.port': 3306,
    'db.connection_string': 'jdbc:mysql://localhost:3306/myDb',
    'db.name': 'myDb',
    'db.user': 'root',
    'db.statement': "SELECT * FROM ..."
  },
  status: { code: 0 },
  events: [],
  links: []
}
{
  traceId: '7f21606ebea9e9def4ebc66a35a76eff', // <--- ✅
  parentId: undefined,
  name: 'HTTP GET',
  id: '66e861f32ea7b8ee',
  kind: 1,
  timestamp: 1654605222309705,
  duration: 34959,
  attributes: {
    'http.url': 'http://127.0.0.1:3000/v1/foo',
    'http.host': '127.0.0.1:3000',
    'net.host.name': '127.0.0.1',
    'http.method': 'GET',
    'http.target': '/v1/foo',
    'http.flavor': '1.1',
    'net.transport': 'ip_tcp',
    'net.host.ip': '::ffff:127.0.0.1',
    'net.host.port': 3000,
    'net.peer.ip': '::ffff:127.0.0.1',
    'net.peer.port': 52724,
    'http.status_code': 200,
    'http.status_text': 'OK'
  },
  status: { code: 0 },
  events: [],
  links: []
}
Second call
debug /v1/foo  {
  "timestamp": "2022-06-07T12:37:57.505Z",
  "trace_id": "2d78fcc700426218f758e3b4153f65c4", // <--- ✅
  "span_id": "6f80f4662161f7b6",
  "trace_flags": "01"
}
{
  traceId: '2d78fcc700426218f758e3b4153f65c4', // <--- ✅
  parentId: '6f80f4662161f7b6',
  name: 'SELECT',
  id: 'cd0f7e5eb95562c2',
  kind: 2,
  timestamp: 1654605477492792,
  duration: 7623,
  attributes: {
    'db.system': 'mysql',
    'net.peer.name': 'localhost',
    'net.peer.port': 3306,
    'db.connection_string': 'jdbc:mysql://localhost:3306/myDb',
    'db.name': 'myDb',
    'db.user': 'root',
    'db.statement': "SELECT * FROM ..."
  },
  status: { code: 0 },
  events: [],
  links: []
}
{
  traceId: '2d78fcc700426218f758e3b4153f65c4', // <--- ✅
  parentId: undefined,
  name: 'HTTP GET',
  id: '6f80f4662161f7b6',
  kind: 1,
  timestamp: 1654605477480136,
  duration: 21166,
  attributes: {
    'http.url': 'http://127.0.0.1:3000/v1/foo',
    'http.host': '127.0.0.1:3000',
    'net.host.name': '127.0.0.1',
    'http.method': 'GET',
    'http.target': '/v1/foo',
    'http.flavor': '1.1',
    'net.transport': 'ip_tcp',
    'net.host.ip': '::ffff:127.0.0.1',
    'net.host.port': 3000,
    'net.peer.ip': '::ffff:127.0.0.1',
    'net.peer.port': 53752,
    'http.status_code': 200,
    'http.status_text': 'OK'
  },
  status: { code: 0 },
  events: [],
  links: []
}

mbrevda added the bug label Jun 7, 2022

github-actions bot commented Aug 8, 2022

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days.

github-actions bot added the stale label Aug 8, 2022

mbrevda commented Aug 8, 2022

unstale

github-actions bot removed the stale label Aug 15, 2022
dyladan self-assigned this Sep 14, 2022
dyladan added the priority:p2 label Sep 14, 2022
github-actions bot commented Feb 27, 2023

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days.

github-actions bot added the stale label Feb 27, 2023

mbrevda commented Feb 27, 2023

Really?

github-actions bot removed the stale label Mar 6, 2023
github-actions bot commented May 22, 2023

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days.

github-actions bot added the stale label May 22, 2023

mbrevda commented May 22, 2023

.

github-actions bot removed the stale label May 29, 2023
github-actions bot commented Aug 14, 2023

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days.

github-actions bot added the stale label Aug 14, 2023

mbrevda commented Aug 15, 2023

🫥

github-actions bot removed the stale label Aug 21, 2023