
Download tools from API server #608

Merged: 8 commits into juju:master, Aug 28, 2014

Conversation

@axw (Contributor) commented Aug 26, 2014

We will be eliminating the use of provider storage for tools. This is a
step in that direction: hiding the use of provider storage behind an HTTP
API provided by the API server for fetching tools. Later we will change the
backend to fetch the tools from GridFS, when that work has been
implemented.

Now that we are serving tools in the API server, we must consider the
availability of the API server at the time a machine is provisioned. The
aria2 package is an alternative to curl that is capable of failing over to
additional sources and optionally downloading chunks in parallel.
Note: curl can be built with support for Metalink, which would have been a
more conservative option. Unfortunately the version of curl packaged in
Ubuntu does not have Metalink support enabled.

The local provider always has tools available locally, and user-data for
lxc and kvm does not have the size limitations of cloud providers. Thus,
we can safely serialise the tools tarball into user-data in the local
provider.

client := utils.GetHTTPClient(verify)
resp, err := client.Get(tools.URL)
if err != nil {
h.sendError(w, http.StatusBadRequest, fmt.Sprintf("failed to get %q: %v", err.Error()))
Contributor

I think you forgot an argument? Sprintf needs two (%q and %v) but is given one.

Contributor Author

Thanks, fixed.

@ericsnowcurrently (Contributor)

Would it make sense to make use of utils/filestorage (and the implementation in juju/state/backups.go)? It's the package I added expressly to hide the underlying storage implementation, which seems to me like the same thing you are trying to accomplish.

@axw (Contributor, Author) commented Aug 27, 2014

> Would it make sense to make use of utils/filestorage (and the implementation in juju/state/backups.go)? It's the package I added expressly to hide the underlying storage implementation, which seems to me like the same thing you are trying to accomplish.

I don't think so. This PR is about hiding the fact that there's external storage altogether from the clients (i.e. cloud-init script). Right now, the cloud-init scripts go and fetch tools directly from provider storage (e.g. an S3 bucket on AWS).

The end goal is to have tools stored in GridFS, replicated across each of the state servers. We'll still need a way for the cloud-init script to fetch them out of there, hence this HTTP endpoint. Introducing it early means we can easily switch out the backend later.

@ericsnowcurrently (Contributor)

Makes sense.

@waigani (Contributor) commented Aug 27, 2014

Could you add 2014 to the copyright headers that don't have it please?

@waigani (Contributor) commented Aug 27, 2014

Thanks for updating the sendError func. @davecheney is my review mentor, so he'll need to take a look too.

@axw (Contributor, Author) commented Aug 27, 2014

> Could you add 2014 to the copyright headers that don't have it please?

Copyright law (in the US at least) only requires the first year of publication. In Australia it's automatically given; not sure what it is in the UK. I'd prefer we just used US-style unless given some other guidance by IP legal.

@@ -194,7 +194,7 @@ func AddAptCommands(
c.SetAptUpgrade(addUpgradeScripts)
c.SetAptGetWrapper("eatmydata")

-c.AddPackage("curl")
+c.AddPackage("aria2")
Contributor

Why not use wget, as I suggested to Ian a week ago? With the -c option it'll keep retrying until it gets the whole thing.

Contributor Author

That's not the point of using aria2. The point is to support failing over to alternative URLs in case one of the API servers becomes unresponsive.
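The failover behaviour comes from aria2c accepting several URIs for the same output file and treating them as alternative sources. A hedged sketch of how cloud-init user-data might build such a command; `aria2Command` and the server addresses are hypothetical, not the PR's actual code.

```go
// Hypothetical helper: build an aria2c invocation that lists every API
// server as a mirror for the same file, so the download fails over if one
// server is unresponsive.
package main

import (
	"fmt"
	"strings"
)

func aria2Command(output string, urls []string) string {
	// aria2c tries the next URI when one source for the output file fails.
	return fmt.Sprintf("aria2c -o %s %s", output, strings.Join(urls, " "))
}

func main() {
	// Illustrative API-server addresses only.
	urls := []string{
		"https://10.0.0.1:17070/tools/1.20.0",
		"https://10.0.0.2:17070/tools/1.20.0",
	}
	fmt.Println(aria2Command("tools.tgz", urls))
}
```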

@davecheney (Contributor)

LGTM.

@axw force-pushed the cloudinit-download-tools-apiserver branch from f327a08 to a65707d on August 28, 2014 at 06:06
@axw (Contributor, Author) commented Aug 28, 2014

$$merge$$

@jujubot (Collaborator) commented Aug 28, 2014

Status: merge request accepted. Url: http://juju-ci.vapour.ws:8080/job/github-merge-juju

@jujubot (Collaborator) commented Aug 28, 2014

Build failed: Generating tarball failed
build url: http://juju-ci.vapour.ws:8080/job/github-merge-juju/455

axw added 6 commits August 28, 2014 15:14
Log errors in sendError functions, and
change signature to not return error.
@axw force-pushed the cloudinit-download-tools-apiserver branch from a65707d to d02f41e on August 28, 2014 at 07:17
@axw (Contributor, Author) commented Aug 28, 2014

$$rebased$$

@jujubot (Collaborator) commented Aug 28, 2014

Status: merge request accepted. Url: http://juju-ci.vapour.ws:8080/job/github-merge-juju

@jujubot (Collaborator) commented Aug 28, 2014

Build failed: Tests failed
build url: http://juju-ci.vapour.ws:8080/job/github-merge-juju/456

We have tests for prereq checking; we don't need to do it everywhere.
@axw (Contributor, Author) commented Aug 28, 2014

$$!!$$

@jujubot (Collaborator) commented Aug 28, 2014

Status: merge request accepted. Url: http://juju-ci.vapour.ws:8080/job/github-merge-juju

@jujubot (Collaborator) commented Aug 28, 2014

Build failed: Tests failed
build url: http://juju-ci.vapour.ws:8080/job/github-merge-juju/457

@axw (Contributor, Author) commented Aug 28, 2014

$$patchedintherightplacethistime$$

@jujubot (Collaborator) commented Aug 28, 2014

Status: merge request accepted. Url: http://juju-ci.vapour.ws:8080/job/github-merge-juju

jujubot added a commit that referenced this pull request Aug 28, 2014
Download tools from API server

@jujubot jujubot merged commit f8ff456 into juju:master Aug 28, 2014