how to handle go import absolute paths and github forks?
There are plenty of questions around this, including why you shouldn't use import "./my/path" and why it only works at all because some legacy Go code requires it.
If this is correct, how do you handle encapsulation of a project and by extension github forks? In every other lang, I can do a github fork of a project, or git clone, and everything is encapsulated there. How do I get the same behaviour out of a go project?
A simple example, using the Go "hello world" example:
hello.go
package main

import (
    "fmt"

    "github.com/golang/examples/stringutil"
)

func main() {
    fmt.Printf(stringutil.Reverse("hello, world") + "\n")
}
The above works great. But if I want to use my own stringutil which is in a subdirectory and will compile to a single binary, I still need the complete path:
package main

import (
    "fmt"

    "github.com/myrepo/examples/util/stringutil"
)

func main() {
    fmt.Printf(stringutil.Reverse("hello, world") + "\n")
}
Now, if someone copies or forks my repo, it has a direct dependency on "github.com/myrepo/", even though this is used entirely internally!
What if there are 20 different files that import utils/? I need to change each one each time someone forks? That is a lot of extraneous changes and a nonsensical git commit.
What am I missing here? Why are relative paths such a bad thing? How do I fork a project that refers to its own subsidiary directories (and their packages) without changing dozens of files?
Something has changed. Go modules are now supported (as of 1.11) if you are outside of GOPATH or set the env var GO111MODULE=on. It still doesn't solve the fundamental problem, and is one of my major gripes with Go. Ignoring languages, I have always strongly felt that node packaging got it right: arbitrary names inside files in the package, and a "mapping file" (package.json or whatever) to map to absolute locations. Oh well. – deitch
Nov 15 '18 at 9:00
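For what it's worth, a Go modules replace directive can paper over the fork problem in one place, much like the package.json mapping described above. A minimal go.mod sketch (module paths and the ../examples-fork location are hypothetical, and the fork directory needs its own go.mod for a relative replace to resolve):

```
// go.mod -- module and fork paths here are hypothetical.
module github.com/myrepo/examples

go 1.12

// Redirect every import of the upstream module to the local fork,
// without touching a single import statement in the .go files.
replace github.com/golang/examples => ../examples-fork
```

The import paths in the source stay spelled as the upstream module; only this one file changes per fork.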
I looked briefly at Go modules until I saw the word "experimental". The solution seems to be to clone into repos that are named the same as the upstream repo, while pushing to origin (fork) for private changes. It's all pretty lame.
– Sentinel
Nov 15 '18 at 10:00
Golang module structure is pretty lame. Its adoption is despite its weaknesses. :-(
– deitch
Nov 15 '18 at 13:01
1 Answer
As for the reasoning behind not allowing relative imports, you can read this discussion for some perspective: https://groups.google.com/forum/#!msg/golang-nuts/n9d8RzVnadk/07f9RDlwLsYJ
Personally I'd rather have them enabled, at least for internal imports, exactly for the reason you are describing.
Now, how to deal with the situation?
If your fork is just a small fix to another project that will probably be accepted as a PR soon - just manually edit the git remotes so they refer to your own git repo and not the original one. If you're using a vendoring solution like godep, it will work smoothly, since saving will just vendor your forked code, and go get is never used directly.
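That remote edit can be sketched as follows (repo names hypothetical; a git init stands in for the original clone). The trick is that the working copy stays at the upstream import path on disk while origin points at the fork:

```shell
# Keep the checkout where the upstream import path expects it,
# e.g. $GOPATH/src/github.com/golang/examples:
mkdir -p src/github.com/golang/examples
cd src/github.com/golang/examples
git init -q .
git remote add origin https://github.com/golang/examples.git

# Repoint origin at the fork; pushes now go there,
# and no import path in any .go file has to change.
git remote set-url origin https://github.com/myrepo/examples.git
git remote get-url origin
```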
If your fork is a big change and you intend to remain forked, rewrite all the import paths. You can automate it with sed, or you can use gofmt -r, which supports rewriting the code being formatted.
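The sed variant might look like this (paths hypothetical, demonstrated on a scratch file rather than a real checkout; GNU sed's -i is assumed):

```shell
# Scratch file standing in for one of the 20 files in the tree:
mkdir -p demo && cd demo
printf 'package main\n\nimport "github.com/golang/examples/stringutil"\n' > hello.go

# Rewrite the upstream import prefix to the fork's prefix in every .go file:
find . -name '*.go' -exec \
  sed -i 's|github.com/golang/examples|github.com/myrepo/examples|g' {} +

grep stringutil hello.go
```

The gofmt -r form takes a 'pattern -> replacement' rewrite rule instead; test either one on a scratch copy first, since the substitution is purely textual and will also hit matching strings outside import blocks.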
[EDIT] I also found this tool which is designed to help with that situation: https://github.com/rogpeppe/govers
I've done both 1 and 2 - when I just had a small bugfix to some library, I just changed the remote and vendored it. When I actually forked a library with no intent of merging my changes back, I changed all the import paths and continued to use my repo only.
I can also think of an addition to vendoring tools allowing automation of this stuff, but I don't think any of them support it currently.
So go's idea of a package is absolute. Period. You can put directories above/below each other, but if one thing requires another, and is not in the same package, it is an absolute import. Which means either don't build software composed of packages that build together and aren't absolute, or accept lots of wasteful rewrites. I don't get it. Weren't they thinking of this when they designed the package namespace?
– deitch
Aug 23 '15 at 13:42
Google's philosophy on this is that they vendor everything into one source tree and that's it. So it kinda makes this whole idea of forking dependencies redundant, as every dependency is already forked in a sense.
– Not_a_Golfer
Aug 23 '15 at 13:45
"they vendor everything into one source tree ... every dependency is already forked." So the way the rest of the world writes software - larger package, sub files and folders that allow us to organize and build things together into a single release, and be self-encapsulated, and lets things like github forking and subversion check out, etc. work - Google and therefore go just don't work that way?
– deitch
Aug 23 '15 at 14:20
@deitch I'm not a googler, this is just based on what I've read from the go authors, but in essence - yes. Go doesn't dictate how you should work, and this problem around forks doesn't stem from the way packages are organized, but from the (wrong) assumption that go get is a good enough dependency manager, and that import path == git url. – Not_a_Golfer
Aug 23 '15 at 14:38
In other words, a just fine philosophy for internal corporate project management, not a very good one for collaborative open source management. That's a pity. I really like many of the things go does. This isn't one of them
– deitch
Aug 23 '15 at 15:06
Just hit the same discovery as a newcomer to golang. Any idea if something has changed in 2018?
– Sentinel
Nov 14 '18 at 21:06