Newline character in AWS credentials produces cryptic error #5994

Closed
kousik93 opened this issue Jan 12, 2021 · 2 comments · Fixed by #6107
Labels
  platform: kubernetes - Anything `kubernetes` platform related
  sink: aws_kinesis_firehose - Anything `aws_kinesis_firehose` sink related

Comments


kousik93 commented Jan 12, 2021

Hi there, I am running Vector as a DaemonSet in Kubernetes and trying to send the `kubernetes_logs` source to the AWS Kinesis Firehose sink. After setting up the configuration and credentials, Vector throws errors at startup.

Vector Startup Logs with errors

Jan 12 02:21:49.458  INFO vector::app: Loading configs. path=[("/etc/vector/managed.toml", None)]
Jan 12 02:21:49.461  INFO vector::sources::kubernetes_logs: Obtained Kubernetes Node name to collect logs for (self). self_node_name="ip-10-93-143-36.ec2.internal"
Jan 12 02:21:49.528  INFO vector::topology: Running healthchecks.
Jan 12 02:21:49.530  INFO vector::topology: Starting source. name="kubernetes_logs"
Jan 12 02:21:49.530  INFO vector::topology: Starting sink. name="blackhole"
Jan 12 02:21:49.530  INFO vector::topology: Starting sink. name="amazon_kinesis_firehose"
Jan 12 02:21:49.530  INFO vector: Vector has started. version="0.11.1" git_version="v0.11.1" released="Thu, 17 Dec 2020 17:09:17 +0000" arch="x86_64"
Jan 12 02:21:49.530  INFO vector::app: API is disabled, enable by setting `api.enabled` to `true` and use commands like `vector top`.
Jan 12 02:21:49.537 ERROR vector::topology::builder: Healthcheck: Failed Reason. error=DescribeDeliveryStream failed: error parsing header value: failed to parse header value
Jan 12 02:21:49.542  INFO source{component_kind="source" component_name=kubernetes_logs component_type=kubernetes_logs}:file_server: file_source::checkpointer: Loaded checkpoint data.
Jan 12 02:21:49.596  WARN sink{component_kind="sink" component_name=amazon_kinesis_firehose component_type=aws_kinesis_firehose}:request{request_id=0}: vector::sinks::util::retries: Retrying after error. error=HttpDispatch(HttpDispatchError { message: "error parsing header value: failed to parse header value" })
Jan 12 02:21:49.603  WARN sink{component_kind="sink" component_name=amazon_kinesis_firehose component_type=aws_kinesis_firehose}:request{request_id=1}: vector::sinks::util::retries: Retrying after error. error=HttpDispatch(HttpDispatchError { message: "error parsing header value: failed to parse header value" })

Vector Full config

data_dir = "/vector-data-dir"
[log_schema]
  host_key = "host"
  message_key = "message"
  source_type_key = "source_type"
  timestamp_key = "timestamp"

[sources.kubernetes_logs]
  type = "kubernetes_logs"
[sinks.amazon_kinesis_firehose]
  type = "aws_kinesis_firehose"
  inputs = ["kubernetes_logs"]
  compression = "none" 
  region = "us-east-1" 
  stream_name = "<stream-name>" 
  encoding.codec = "json" # required
  encoding.timestamp_format = "rfc3339"

More Info
I have verified network connectivity, and I have also verified that the credentials work: I used boto to write a small Python script that interacts with Firehose (list, describe, and write to the stream all work), roughly like the sketch below.
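A minimal sketch of that check, assuming boto3 and a placeholder delivery stream name ("my-stream" stands in for the real one):

import boto3

# Placeholder region and stream name; the real values match the Vector config above.
client = boto3.client("firehose", region_name="us-east-1")

# Describe the delivery stream (roughly what the sink healthcheck does).
print(client.describe_delivery_stream(DeliveryStreamName="my-stream"))

# Put a single test record to confirm the credentials can write.
client.put_record(
    DeliveryStreamName="my-stream",
    Record={"Data": b'{"message": "vector credential test"}\n'},
)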

One thing I found is that the Python AWS SDK does not accept None or an empty string as a value for ExclusiveStartDestinationId, yet I see that in the Vector code None is being passed. I'm wondering whether this could be the cause of the error, with the response then failing to parse (see the sketch below). (https://github.com/timberio/vector/blob/87b30293a0d8fa4ea2aeb2af385abc58d01a85d1/src/sinks/aws_kinesis_firehose.rs#L109)
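To illustrate what I mean on the Python side (again with a placeholder stream name), botocore rejects an explicit None for that optional parameter instead of treating it as omitted:

import boto3
from botocore.exceptions import ParamValidationError

client = boto3.client("firehose", region_name="us-east-1")

# Leaving the optional parameter out entirely works fine.
client.describe_delivery_stream(DeliveryStreamName="my-stream")

# Passing None explicitly fails client-side parameter validation.
try:
    client.describe_delivery_stream(
        DeliveryStreamName="my-stream",
        ExclusiveStartDestinationId=None,
    )
except ParamValidationError as err:
    print(err)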

I am very new to this tool and to Rust, so I would really appreciate it if someone could shed some light on this for me. Thanks in advance.

kousik93 (Author) commented:

Update:
My issue seems to be related to rusoto/rusoto#1589.
I had an extra newline in my secrets; Vector started working immediately after I fixed it.
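For anyone who hits the same thing, a quick illustrative check (this assumes the credentials are injected through the standard AWS environment variables; adjust if yours are mounted as files):

import os

# A trailing newline in a credential value (e.g. from a Kubernetes Secret
# created without `echo -n`) ends up in the signed request headers, and
# rusoto then fails with the "error parsing header value" message above.
for var in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN"):
    value = os.environ.get(var)
    if value is not None and value != value.strip():
        print(f"{var} has leading/trailing whitespace (likely a stray newline)")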

This issue can be closed

@jamtur01 jamtur01 added platform: kubernetes Anything `kubernetes` platform related sink: aws_kinesis_firehose Anything `aws_kinesis_firehose` sink related labels Jan 12, 2021
@jamtur01 jamtur01 added this to the 2021-01-04 Xenomass Well milestone Jan 12, 2021
@jamtur01
Copy link
Contributor

@ktff Assigning to you - can we please submit a patch upstream at rusoto/rusoto#1589?

@jamtur01 jamtur01 reopened this Jan 12, 2021
@jamtur01 jamtur01 changed the title Issues with Sending to Firehose Sink from Kubernetes Daemonset Newline character in AWS credentials produces cryptic error Jan 15, 2021