S3 ASF fixes detected by Terraform test suite #7027
Conversation
localstack/services/s3/utils.py
Outdated
ObjectCannedACL.public_read,
ObjectCannedACL.public_read_write,
# according to the docs, 'log-delivery-write' is also valid: https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl
"log-delivery-write",  # TODO add to parser? seems to be missing across services though
We could add it with a patch, I guess; weird that it's not part of it.
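For illustration, a minimal sketch of what the resulting validation could look like. The set and helper names are hypothetical, not the provider's actual code; the enum members beyond those visible in the diff are filled in from the AWS canned-ACL list:

# sketch only: VALID_CANNED_ACLS and is_canned_acl are made-up names,
# assuming the generated ObjectCannedACL enum from the ASF S3 API stubs
from localstack.aws.api.s3 import ObjectCannedACL

VALID_CANNED_ACLS = {
    ObjectCannedACL.private,
    ObjectCannedACL.public_read,
    ObjectCannedACL.public_read_write,
    ObjectCannedACL.authenticated_read,
    # valid according to the AWS docs, but missing from the generated enum
    "log-delivery-write",
}

def is_canned_acl(acl: str) -> bool:
    return acl in VALID_CANNED_ACLS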
thrau left a comment:
LGTM! I don't have a good suggestion on how to get the S3 hack out of the parser right now, so I'd be OK with merging it for now with the FIXME. I'll let Alex make the final call though.
localstack/aws/protocol/parser.py
Outdated
if (
    self.service.service_name == "s3"
    and uri_param_name == "Key"
    and request.url.split("?")[0].endswith(f"{uri_params[uri_param_name]}/")
base_url: "Like url but without the query string."
Suggested change:
-                and request.url.split("?")[0].endswith(f"{uri_params[uri_param_name]}/")
+                and request.base_url.endswith(f"{uri_params[uri_param_name]}/")
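As a quick standalone check (hypothetical paths and values) that the two expressions agree, using werkzeug's test utilities:

from werkzeug.test import EnvironBuilder
from werkzeug.wrappers import Request

# build a request for a key with a trailing '/' plus a query string
env = EnvironBuilder(path="/bucket/my-key/", query_string="x-id=PutObject").get_environ()
request = Request(env)

assert request.url == "http://localhost/bucket/my-key/?x-id=PutObject"
assert request.base_url == "http://localhost/bucket/my-key/"  # no query string
assert request.url.split("?")[0] == request.base_url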
localstack/aws/protocol/parser.py
Outdated
# FIXME special case for s3 object-names (=key): trailing '/' are valid and need to be preserved
# however, the url-matcher removes it from the key
# we check the request.url to verify the name
if (
    self.service.service_name == "s3"
    and uri_param_name == "Key"
    and request.url.split("?")[0].endswith(f"{uri_params[uri_param_name]}/")
Predictably, I'd prefer encapsulating this in the S3 parser somehow :D I don't have a good suggestion at the moment; I'll take a look after the review, maybe @alexrashed also has some ideas.
To be honest, the service_name is something we desperately tried to avoid in these "generalized" parsers.
Since there is already an S3RequestParser, we could check this case there, modify the uri_params, and afterwards call the super method. Something like this:
# in localstack/aws/protocol/parser.py (HttpRequest, Shape, and RestXMLRequestParser
# are already defined or imported in that module)
class S3RequestParser(RestXMLRequestParser):
    def _parse_shape(
        self, request: HttpRequest, shape: Shape, node: Any, uri_params: Mapping[str, Any] = None
    ) -> Any:
        # restore the trailing '/' of the object key that the url-matcher stripped
        if (
            shape is not None
            and uri_params is not None
            and shape.serialization.get("location") == "uri"
            and shape.serialization.get("name") == "Key"
            and request.url.split("?")[0].endswith(f"{uri_params['Key']}/")
        ):
            uri_params = dict(uri_params)
            uri_params["Key"] = uri_params["Key"] + "/"
        return super()._parse_shape(request, shape, node, uri_params)

Another option would be to externalize the location parsing to a separate function, but this doesn't seem to be worth the effort / complexity.
I'm happy to assist if I can be of service!
if request.method == "HEAD":
    # for HEAD we have to keep the original content-length, but it will be re-calculated when creating
    # the final_response object
    final_response.content_length = response.headers.get("Content-Length", 0)
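For context, a small standalone illustration (with made-up values) of why the header has to be carried over: a response built with an empty body, as for HEAD, reports a Content-Length of 0 unless it is set explicitly.

from werkzeug.wrappers import Response

backend_headers = {"Content-Length": "1024"}  # pretend headers from the proxied backend

final_response = Response(b"")  # empty HEAD body -> Content-Length would be "0"
final_response.content_length = int(backend_headers.get("Content-Length", 0))

assert final_response.headers["Content-Length"] == "1024"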
alexrashed left a comment:
Wow! This is great! The S3 ASF migration really goes to the limits (and beyond) of ASF. 🚀
It would be nice if we could externalize the S3-specific edge case in the parser, but this also shouldn't be a blocker (I could also take care of that another time).
…sposition' for parsing
Moved the specific S3 parsing (for the special case with a trailing "/" in the key name) to the S3RequestParser as suggested by @alexrashed (thank you! 🙂). Removing the commit with the excluded CORS tests and rebasing with master now (to check that the tests still work with the old S3 provider), before merging.
This PR contains several fixes for S3 ASF that were detected by the Terraform test suite:
- `ReplaceKeyPrefixWith` for redirect-routing rules (in that case it should remove the `KeyPrefixEquals` from the original url and then do the redirect; see the sketch after this list)
- `content-language`, `content-disposition` and `cache-control` to be picked up by the parser
- `x-amz-server-side-encryption-bucket-key-enabled`, which is returned as camel-case by moto, but should be lowercase to be picked up by the parser
- `content-length` for HEAD requests that are forwarded using the `Proxy` for vhost-style requests
- `delete-objects`: remove the `=` from the path (`/<bucket>/?delete=` but expects `/<bucket>/?delete`)
- `quiet` mode for `delete-objects`
- `log-delivery-write`, which was missing from the `BucketCannedACL`
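A minimal sketch (hypothetical names, not the provider's actual code) of the `ReplaceKeyPrefixWith` behavior from the first fix above: the part of the key matching `KeyPrefixEquals` is replaced before the redirect location is built.

# sketch only: function name and signature are made up for illustration
def build_redirect_location(host: str, key: str, key_prefix_equals: str, replace_key_prefix_with: str) -> str:
    if key.startswith(key_prefix_equals):
        key = replace_key_prefix_with + key[len(key_prefix_equals):]
    return f"https://{host}/{key}"

assert (
    build_redirect_location("example.com", "docs/index.html", "docs/", "documents/")
    == "https://example.com/documents/index.html"
)

TODO before merge: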