// Copyright 2019 The Gitea Authors. All rights reserved.
// Copyright 2018 Jonas Franz. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package migrations

import (
	"context"
	"errors"
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strconv"
	"strings"
	"time"

	"code.gitea.io/gitea/models"
	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/models/foreignreference"
	issues_model "code.gitea.io/gitea/models/issues"
	repo_model "code.gitea.io/gitea/models/repo"
	user_model "code.gitea.io/gitea/models/user"
	"code.gitea.io/gitea/modules/git"
	"code.gitea.io/gitea/modules/log"
	base "code.gitea.io/gitea/modules/migration"
	repo_module "code.gitea.io/gitea/modules/repository"
	"code.gitea.io/gitea/modules/setting"
	"code.gitea.io/gitea/modules/storage"
	"code.gitea.io/gitea/modules/structs"
	"code.gitea.io/gitea/modules/timeutil"
	"code.gitea.io/gitea/modules/uri"
	"code.gitea.io/gitea/services/pull"

	gouuid "github.com/google/uuid"
)

var _ base.Uploader = &GiteaLocalUploader{}

// GiteaLocalUploader implements an Uploader to gitea sites
type GiteaLocalUploader struct {
	ctx            context.Context
	doer           *user_model.User
	repoOwner      string
	repoName       string
	repo           *repo_model.Repository
	labels         map[string]*models.Label
	milestones     map[string]int64
	issues         map[int64]*models.Issue
	gitRepo        *git.Repository
	prHeadCache    map[string]struct{}
	sameApp        bool
	userMap        map[int64]int64 // external user id mapping to user id
	prCache        map[int64]*models.PullRequest
	gitServiceType structs.GitServiceType
}

// NewGiteaLocalUploader creates a Gitea Uploader via the Gitea API v1
func NewGiteaLocalUploader(ctx context.Context, doer *user_model.User, repoOwner, repoName string) *GiteaLocalUploader {
	return &GiteaLocalUploader{
		ctx:         ctx,
		doer:        doer,
		repoOwner:   repoOwner,
		repoName:    repoName,
		labels:      make(map[string]*models.Label),
		milestones:  make(map[string]int64),
		issues:      make(map[int64]*models.Issue),
		prHeadCache: make(map[string]struct{}),
		userMap:     make(map[int64]int64),
		prCache:     make(map[int64]*models.PullRequest),
	}
}

// MaxBatchInsertSize returns the table's max batch insert size
func (g *GiteaLocalUploader) MaxBatchInsertSize(tp string) int {
	switch tp {
	case "issue":
		return db.MaxBatchInsertSize(new(models.Issue))
	case "comment":
		return db.MaxBatchInsertSize(new(models.Comment))
	case "milestone":
		return db.MaxBatchInsertSize(new(issues_model.Milestone))
	case "label":
		return db.MaxBatchInsertSize(new(models.Label))
	case "release":
		return db.MaxBatchInsertSize(new(models.Release))
	case "pullrequest":
		return db.MaxBatchInsertSize(new(models.PullRequest))
	}
	return 10
}

// CreateRepo creates a repository
func (g *GiteaLocalUploader) CreateRepo(repo *base.Repository, opts base.MigrateOptions) error {
	owner, err := user_model.GetUserByName(g.repoOwner)
	if err != nil {
		return err
	}

	var r *repo_model.Repository
	if opts.MigrateToRepoID <= 0 {
		r, err = repo_module.CreateRepository(g.doer, owner, models.CreateRepoOptions{
			Name:           g.repoName,
			Description:    repo.Description,
			OriginalURL:    repo.OriginalURL,
			GitServiceType: opts.GitServiceType,
			IsPrivate:      opts.Private,
			IsMirror:       opts.Mirror,
			Status:         repo_model.RepositoryBeingMigrated,
		})
	} else {
		r, err = repo_model.GetRepositoryByID(opts.MigrateToRepoID)
	}
	if err != nil {
		return err
	}
	r.DefaultBranch = repo.DefaultBranch
	r.Description = repo.Description

	r, err = repo_module.MigrateRepositoryGitData(g.ctx, owner, r, base.MigrateOptions{
		RepoName:       g.repoName,
		Description:    repo.Description,
		OriginalURL:    repo.OriginalURL,
		GitServiceType: opts.GitServiceType,
		Mirror:         repo.IsMirror,
		LFS:            opts.LFS,
		LFSEndpoint:    opts.LFSEndpoint,
		CloneAddr:      repo.CloneURL,
		Private:        repo.IsPrivate,
		Wiki:           opts.Wiki,
		Releases:       opts.Releases, // if we didn't fetch releases, sync them from tags
		MirrorInterval: opts.MirrorInterval,
	}, NewMigrationHTTPTransport())

	g.sameApp = strings.HasPrefix(repo.OriginalURL, setting.AppURL)
	g.repo = r
	if err != nil {
		return err
	}
	g.gitRepo, err = git.OpenRepository(g.ctx, r.RepoPath())
	return err
}

// Close closes this uploader
func (g *GiteaLocalUploader) Close() {
	if g.gitRepo != nil {
		g.gitRepo.Close()
	}
}

// CreateTopics creates topics
func (g *GiteaLocalUploader) CreateTopics(topics ...string) error {
	// ignore topics too long for the db
	c := 0
	for i := range topics {
		if len(topics[i]) <= 50 {
			topics[c] = topics[i]
			c++
		}
	}
	topics = topics[:c]
	return repo_model.SaveTopics(g.repo.ID, topics...)
}

// CreateMilestones creates milestones
func (g *GiteaLocalUploader) CreateMilestones(milestones ...*base.Milestone) error {
	mss := make([]*issues_model.Milestone, 0, len(milestones))
	for _, milestone := range milestones {
		var deadline timeutil.TimeStamp
		if milestone.Deadline != nil {
			deadline = timeutil.TimeStamp(milestone.Deadline.Unix())
		}
		if deadline == 0 {
			deadline = timeutil.TimeStamp(time.Date(9999, 1, 1, 0, 0, 0, 0, setting.DefaultUILocation).Unix())
		}

		if milestone.Created.IsZero() {
			if milestone.Updated != nil {
				milestone.Created = *milestone.Updated
			} else if milestone.Deadline != nil {
				milestone.Created = *milestone.Deadline
			} else {
				milestone.Created = time.Now()
			}
		}
		if milestone.Updated == nil || milestone.Updated.IsZero() {
			milestone.Updated = &milestone.Created
		}

		ms := issues_model.Milestone{
			RepoID:       g.repo.ID,
			Name:         milestone.Title,
			Content:      milestone.Description,
			IsClosed:     milestone.State == "closed",
			CreatedUnix:  timeutil.TimeStamp(milestone.Created.Unix()),
			UpdatedUnix:  timeutil.TimeStamp(milestone.Updated.Unix()),
			DeadlineUnix: deadline,
		}
		if ms.IsClosed && milestone.Closed != nil {
			ms.ClosedDateUnix = timeutil.TimeStamp(milestone.Closed.Unix())
		}
		mss = append(mss, &ms)
	}

	err := models.InsertMilestones(mss...)
	if err != nil {
		return err
	}

	for _, ms := range mss {
		g.milestones[ms.Name] = ms.ID
	}
	return nil
}

// CreateLabels creates labels
func (g *GiteaLocalUploader) CreateLabels(labels ...*base.Label) error {
	lbs := make([]*models.Label, 0, len(labels))
	for _, label := range labels {
		lbs = append(lbs, &models.Label{
			RepoID:      g.repo.ID,
			Name:        label.Name,
			Description: label.Description,
			Color:       fmt.Sprintf("#%s", label.Color),
		})
	}

	err := models.NewLabels(lbs...)
	if err != nil {
		return err
	}
	for _, lb := range lbs {
		g.labels[lb.Name] = lb
	}
	return nil
}

// CreateReleases creates releases
func (g *GiteaLocalUploader) CreateReleases(releases ...*base.Release) error {
	rels := make([]*models.Release, 0, len(releases))
	for _, release := range releases {
		if release.Created.IsZero() {
			if !release.Published.IsZero() {
				release.Created = release.Published
			} else {
				release.Created = time.Now()
			}
		}

		rel := models.Release{
			RepoID:       g.repo.ID,
			TagName:      release.TagName,
			LowerTagName: strings.ToLower(release.TagName),
			Target:       release.TargetCommitish,
			Title:        release.Name,
			Note:         release.Body,
			IsDraft:      release.Draft,
			IsPrerelease: release.Prerelease,
			IsTag:        false,
			CreatedUnix:  timeutil.TimeStamp(release.Created.Unix()),
		}

		if err := g.remapUser(release, &rel); err != nil {
			return err
		}

		// calc NumCommits if possible
		if rel.TagName != "" {
			commit, err := g.gitRepo.GetTagCommit(rel.TagName)
			if !errors.Is(err, git.ErrNotExist{}) {
				if err != nil {
					return fmt.Errorf("GetTagCommit[%v]: %v", rel.TagName, err)
				}
				rel.Sha1 = commit.ID.String()
				rel.NumCommits, err = commit.CommitsCount()
				if err != nil {
					return fmt.Errorf("CommitsCount: %v", err)
				}
			}
		}

		for _, asset := range release.Assets {
			if asset.Created.IsZero() {
				if !asset.Updated.IsZero() {
					asset.Created = asset.Updated
				} else {
					asset.Created = release.Created
				}
			}
			attach := repo_model.Attachment{
				UUID:          gouuid.New().String(),
				Name:          asset.Name,
				DownloadCount: int64(*asset.DownloadCount),
				Size:          int64(*asset.Size),
				CreatedUnix:   timeutil.TimeStamp(asset.Created.Unix()),
			}

			// download attachment
			err := func() error {
				// asset.DownloadURL may be a local file
				var rc io.ReadCloser
				var err error
				if asset.DownloadFunc != nil {
					rc, err = asset.DownloadFunc()
					if err != nil {
						return err
					}
				} else if asset.DownloadURL != nil {
					rc, err = uri.Open(*asset.DownloadURL)
					if err != nil {
						return err
					}
				}
				if rc == nil {
					return nil
				}
				_, err = storage.Attachments.Save(attach.RelativePath(), rc, int64(*asset.Size))
				rc.Close()
				return err
			}()
			if err != nil {
				return err
			}

			rel.Attachments = append(rel.Attachments, &attach)
		}

		rels = append(rels, &rel)
	}

	return models.InsertReleases(rels...)
}

// SyncTags syncs releases with tags in the database
func (g *GiteaLocalUploader) SyncTags() error {
	return repo_module.SyncReleasesWithTags(g.repo, g.gitRepo)
}

// CreateIssues creates issues
func (g *GiteaLocalUploader) CreateIssues(issues ...*base.Issue) error {
	iss := make([]*models.Issue, 0, len(issues))
	for _, issue := range issues {
		var labels []*models.Label
		for _, label := range issue.Labels {
			lb, ok := g.labels[label.Name]
			if ok {
				labels = append(labels, lb)
			}
		}

		milestoneID := g.milestones[issue.Milestone]

		if issue.Created.IsZero() {
			if issue.Closed != nil {
				issue.Created = *issue.Closed
			} else {
				issue.Created = time.Now()
			}
		}
		if issue.Updated.IsZero() {
			if issue.Closed != nil {
				issue.Updated = *issue.Closed
			} else {
				issue.Updated = time.Now()
			}
		}

		is := models.Issue{
			RepoID:      g.repo.ID,
			Repo:        g.repo,
			Index:       issue.Number,
			Title:       issue.Title,
			Content:     issue.Content,
			Ref:         issue.Ref,
			IsClosed:    issue.State == "closed",
			IsLocked:    issue.IsLocked,
			MilestoneID: milestoneID,
			Labels:      labels,
			CreatedUnix: timeutil.TimeStamp(issue.Created.Unix()),
			UpdatedUnix: timeutil.TimeStamp(issue.Updated.Unix()),
			ForeignReference: &foreignreference.ForeignReference{
				LocalIndex:   issue.GetLocalIndex(),
				ForeignIndex: strconv.FormatInt(issue.GetForeignIndex(), 10),
				RepoID:       g.repo.ID,
				Type:         foreignreference.TypeIssue,
			},
		}

		if err := g.remapUser(issue, &is); err != nil {
			return err
		}

		if issue.Closed != nil {
			is.ClosedUnix = timeutil.TimeStamp(issue.Closed.Unix())
		}
		// add reactions
		for _, reaction := range issue.Reactions {
			res := issues_model.Reaction{
				Type:        reaction.Content,
				CreatedUnix: timeutil.TimeStampNow(),
			}
			if err := g.remapUser(reaction, &res); err != nil {
				return err
			}
			is.Reactions = append(is.Reactions, &res)
		}
		iss = append(iss, &is)
	}

	if len(iss) > 0 {
		if err := models.InsertIssues(iss...); err != nil {
			return err
		}

		for _, is := range iss {
			g.issues[is.Index] = is
		}
	}

	return nil
}
|
|
|
|
|

// CreateComments creates comments of issues
func (g *GiteaLocalUploader) CreateComments(comments ...*base.Comment) error {
	cms := make([]*models.Comment, 0, len(comments))
	for _, comment := range comments {
		issue, ok := g.issues[comment.IssueIndex]
		if !ok {
			return fmt.Errorf("comment references non existent IssueIndex %d", comment.IssueIndex)
		}

		if comment.Created.IsZero() {
			comment.Created = time.Unix(int64(issue.CreatedUnix), 0)
		}
		if comment.Updated.IsZero() {
			comment.Updated = comment.Created
		}

		cm := models.Comment{
			IssueID:     issue.ID,
			Type:        models.CommentTypeComment,
			Content:     comment.Content,
			CreatedUnix: timeutil.TimeStamp(comment.Created.Unix()),
			UpdatedUnix: timeutil.TimeStamp(comment.Updated.Unix()),
		}

		if err := g.remapUser(comment, &cm); err != nil {
			return err
		}

		// add reactions
		for _, reaction := range comment.Reactions {
			res := issues_model.Reaction{
				Type:        reaction.Content,
				CreatedUnix: timeutil.TimeStampNow(),
			}
			if err := g.remapUser(reaction, &res); err != nil {
				return err
			}
			cm.Reactions = append(cm.Reactions, &res)
		}

		cms = append(cms, &cm)
	}

	if len(cms) == 0 {
		return nil
	}
	return models.InsertIssueComments(cms)
}

// CreatePullRequests creates pull requests
func (g *GiteaLocalUploader) CreatePullRequests(prs ...*base.PullRequest) error {
	gprs := make([]*models.PullRequest, 0, len(prs))
	for _, pr := range prs {
		gpr, err := g.newPullRequest(pr)
		if err != nil {
			return err
		}

		if err := g.remapUser(pr, gpr.Issue); err != nil {
			return err
		}

		gprs = append(gprs, gpr)
	}
	if err := models.InsertPullRequests(gprs...); err != nil {
		return err
	}
	for _, pr := range gprs {
		g.issues[pr.Issue.Index] = pr.Issue
		pull.AddToTaskQueue(pr)
	}
	return nil
}

func (g *GiteaLocalUploader) updateGitForPullRequest(pr *base.PullRequest) (head string, err error) {
	// download patch file
	err = func() error {
		if pr.PatchURL == "" {
			return nil
		}
		// pr.PatchURL may be a local file
		ret, err := uri.Open(pr.PatchURL)
		if err != nil {
			return err
		}
		defer ret.Close()
		pullDir := filepath.Join(g.repo.RepoPath(), "pulls")
		if err = os.MkdirAll(pullDir, os.ModePerm); err != nil {
			return err
		}
		f, err := os.Create(filepath.Join(pullDir, fmt.Sprintf("%d.patch", pr.Number)))
		if err != nil {
			return err
		}
		defer f.Close()
		_, err = io.Copy(f, ret)
		return err
	}()
	if err != nil {
		return "", err
	}

	// set head information
	pullHead := filepath.Join(g.repo.RepoPath(), "refs", "pull", fmt.Sprintf("%d", pr.Number))
	if err := os.MkdirAll(pullHead, os.ModePerm); err != nil {
		return "", err
	}
	p, err := os.Create(filepath.Join(pullHead, "head"))
	if err != nil {
		return "", err
	}
	_, err = p.WriteString(pr.Head.SHA)
	p.Close()
	if err != nil {
		return "", err
	}

	head = "unknown repository"
	if pr.IsForkPullRequest() && pr.State != "closed" {
		if pr.Head.OwnerName != "" {
			remote := pr.Head.OwnerName
			_, ok := g.prHeadCache[remote]
			if !ok {
				// git remote add
				err := g.gitRepo.AddRemote(remote, pr.Head.CloneURL, true)
				if err != nil {
					log.Error("AddRemote failed: %s", err)
				} else {
					g.prHeadCache[remote] = struct{}{}
					ok = true
				}
			}

			if ok {
				_, _, err = git.NewCommand(g.ctx, "fetch", "--no-tags", "--", remote, pr.Head.Ref).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
				if err != nil {
					log.Error("Fetch branch from %s failed: %v", pr.Head.CloneURL, err)
				} else {
					headBranch := filepath.Join(g.repo.RepoPath(), "refs", "heads", pr.Head.OwnerName, pr.Head.Ref)
					if err := os.MkdirAll(filepath.Dir(headBranch), os.ModePerm); err != nil {
						return "", err
					}
					b, err := os.Create(headBranch)
					if err != nil {
						return "", err
					}
					_, err = b.WriteString(pr.Head.SHA)
					b.Close()
					if err != nil {
						return "", err
					}
					head = pr.Head.OwnerName + "/" + pr.Head.Ref
				}
			}
		}
	} else {
		head = pr.Head.Ref
		// Ensure the closed PR SHA still points to an existing ref
		_, _, err = git.NewCommand(g.ctx, "rev-list", "--quiet", "-1", pr.Head.SHA).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
		if err != nil {
			if pr.Head.SHA != "" {
				// git update-ref removes bad references with a relative path
				log.Warn("Deprecated local head, removing: %v", pr.Head.SHA)
				err = g.gitRepo.RemoveReference(pr.GetGitRefName())
			} else {
				// The SHA is empty, remove the head file
				log.Warn("Empty reference, removing: %v", pullHead)
				err = os.Remove(filepath.Join(pullHead, "head"))
			}
			if err != nil {
				log.Error("Cannot remove local head ref, %v", err)
			}
		}
	}

	return head, nil
}

func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*models.PullRequest, error) {
	var labels []*models.Label
	for _, label := range pr.Labels {
		lb, ok := g.labels[label.Name]
		if ok {
			labels = append(labels, lb)
		}
	}

	milestoneID := g.milestones[pr.Milestone]

	head, err := g.updateGitForPullRequest(pr)
	if err != nil {
		return nil, fmt.Errorf("updateGitForPullRequest: %w", err)
	}

	if pr.Created.IsZero() {
		if pr.Closed != nil {
			pr.Created = *pr.Closed
		} else if pr.MergedTime != nil {
			pr.Created = *pr.MergedTime
		} else {
			pr.Created = time.Now()
		}
	}
	if pr.Updated.IsZero() {
		pr.Updated = pr.Created
	}

	issue := models.Issue{
		RepoID:      g.repo.ID,
		Repo:        g.repo,
		Title:       pr.Title,
		Index:       pr.Number,
		Content:     pr.Content,
		MilestoneID: milestoneID,
		IsPull:      true,
		IsClosed:    pr.State == "closed",
		IsLocked:    pr.IsLocked,
		Labels:      labels,
		CreatedUnix: timeutil.TimeStamp(pr.Created.Unix()),
		UpdatedUnix: timeutil.TimeStamp(pr.Updated.Unix()),
	}

	if err := g.remapUser(pr, &issue); err != nil {
		return nil, err
	}

	// add reactions
	for _, reaction := range pr.Reactions {
		res := issues_model.Reaction{
			Type:        reaction.Content,
			CreatedUnix: timeutil.TimeStampNow(),
		}
		if err := g.remapUser(reaction, &res); err != nil {
			return nil, err
		}
		issue.Reactions = append(issue.Reactions, &res)
	}

	pullRequest := models.PullRequest{
		HeadRepoID: g.repo.ID,
		HeadBranch: head,
		BaseRepoID: g.repo.ID,
		BaseBranch: pr.Base.Ref,
		MergeBase:  pr.Base.SHA,
		Index:      pr.Number,
		HasMerged:  pr.Merged,

		Issue: &issue,
	}

	if pullRequest.Issue.IsClosed && pr.Closed != nil {
		pullRequest.Issue.ClosedUnix = timeutil.TimeStamp(pr.Closed.Unix())
	}
	if pullRequest.HasMerged && pr.MergedTime != nil {
		pullRequest.MergedUnix = timeutil.TimeStamp(pr.MergedTime.Unix())
		pullRequest.MergedCommitID = pr.MergeCommitSHA
		pullRequest.MergerID = g.doer.ID
	}

	// TODO: assignees

	return &pullRequest, nil
}

func convertReviewState(state string) models.ReviewType {
	switch state {
	case base.ReviewStatePending:
		return models.ReviewTypePending
	case base.ReviewStateApproved:
		return models.ReviewTypeApprove
	case base.ReviewStateChangesRequested:
		return models.ReviewTypeReject
	case base.ReviewStateCommented:
		return models.ReviewTypeComment
	default:
		return models.ReviewTypePending
	}
}

// CreateReviews create pull request reviews of currently migrated issues
func (g *GiteaLocalUploader) CreateReviews(reviews ...*base.Review) error {
	cms := make([]*models.Review, 0, len(reviews))
	for _, review := range reviews {
		issue, ok := g.issues[review.IssueIndex]
		if !ok {
			return fmt.Errorf("review references non existent IssueIndex %d", review.IssueIndex)
		}
		if review.CreatedAt.IsZero() {
			review.CreatedAt = time.Unix(int64(issue.CreatedUnix), 0)
		}

		cm := models.Review{
			Type:        convertReviewState(review.State),
			IssueID:     issue.ID,
			Content:     review.Content,
			Official:    review.Official,
			CreatedUnix: timeutil.TimeStamp(review.CreatedAt.Unix()),
			UpdatedUnix: timeutil.TimeStamp(review.CreatedAt.Unix()),
		}

		if err := g.remapUser(review, &cm); err != nil {
			return err
		}

		// get pr
		pr, ok := g.prCache[issue.ID]
		if !ok {
			var err error
			pr, err = models.GetPullRequestByIssueIDWithNoAttributes(issue.ID)
			if err != nil {
				return err
			}
			g.prCache[issue.ID] = pr
		}

		for _, comment := range review.Comments {
			line := comment.Line
			if line != 0 {
				comment.Position = 1
			} else {
				_, _, line, _ = git.ParseDiffHunkString(comment.DiffHunk)
			}
			headCommitID, err := g.gitRepo.GetRefCommitID(pr.GetGitRefName())
			if err != nil {
				log.Warn("GetRefCommitID[%s]: %v, the review comment will be ignored", pr.GetGitRefName(), err)
				continue
			}

			var patch string
			reader, writer := io.Pipe()
			defer func() {
				_ = reader.Close()
				_ = writer.Close()
			}()
			go func(comment *base.ReviewComment) {
				if err := git.GetRepoRawDiffForFile(g.gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, comment.TreePath, writer); err != nil {
					// Ignore the error since the commit may have been removed by a force push to the pull request
					log.Warn("GetRepoRawDiffForFile failed when migrating [%s, %s, %s, %s]: %v", g.gitRepo.Path, pr.MergeBase, headCommitID, comment.TreePath, err)
				}
				_ = writer.Close()
			}(comment)

			patch, _ = git.CutDiffAroundLine(reader, int64((&models.Comment{Line: int64(line + comment.Position - 1)}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)

			if comment.CreatedAt.IsZero() {
				comment.CreatedAt = review.CreatedAt
			}
			if comment.UpdatedAt.IsZero() {
				comment.UpdatedAt = comment.CreatedAt
			}

			c := models.Comment{
				Type:        models.CommentTypeCode,
				IssueID:     issue.ID,
				Content:     comment.Content,
				Line:        int64(line + comment.Position - 1),
				TreePath:    comment.TreePath,
				CommitSHA:   comment.CommitID,
				Patch:       patch,
				CreatedUnix: timeutil.TimeStamp(comment.CreatedAt.Unix()),
				UpdatedUnix: timeutil.TimeStamp(comment.UpdatedAt.Unix()),
			}

			if err := g.remapUser(review, &c); err != nil {
				return err
			}

			cm.Comments = append(cm.Comments, &c)
		}

		cms = append(cms, &cm)
	}

	return models.InsertReviews(cms)
}

// Rollback removes all the changes when the migration failed.
func (g *GiteaLocalUploader) Rollback() error {
	if g.repo != nil && g.repo.ID > 0 {
		g.gitRepo.Close()
		if err := models.DeleteRepository(g.doer, g.repo.OwnerID, g.repo.ID); err != nil {
			return err
		}
	}
	return nil
}
|
2020-12-27 06:34:19 +03:00
|
|
|
|
|
|
|
// Finish when migrating success, this will do some status update things.
|
|
|
|
func (g *GiteaLocalUploader) Finish() error {
|
|
|
|
if g.repo == nil || g.repo.ID <= 0 {
|
|
|
|
return ErrRepoNotCreated
|
|
|
|
}
|
|
|
|
|
2021-08-13 16:06:18 +03:00
|
|
|
// update issue_index
|
|
|
|
if err := models.RecalculateIssueIndexForRepo(g.repo.ID); err != nil {
|
|
|
|
return err
|
|
|
|
}
|
|
|
|
|
2022-03-22 18:22:54 +03:00
|
|
|
if err := models.UpdateRepoStats(g.ctx, g.repo.ID); err != nil {
|
2022-01-17 21:31:58 +03:00
|
|
|
return err
|
|
|
|
}
|
|
|
|
|
2021-12-10 04:27:50 +03:00
|
|
|
g.repo.Status = repo_model.RepositoryReady
|
2021-12-12 18:48:20 +03:00
|
|
|
return repo_model.UpdateRepositoryCols(g.repo, "status")
|
2020-12-27 06:34:19 +03:00
|
|
|
}
|

func (g *GiteaLocalUploader) remapUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) error {
	var userid int64
	var err error
	if g.sameApp {
		userid, err = g.remapLocalUser(source, target)
	} else {
		userid, err = g.remapExternalUser(source, target)
	}

	if err != nil {
		return err
	}

	if userid > 0 {
		return target.RemapExternalUser("", 0, userid)
	}
	return target.RemapExternalUser(source.GetExternalName(), source.GetExternalID(), g.doer.ID)
}

func (g *GiteaLocalUploader) remapLocalUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) (int64, error) {
	userid, ok := g.userMap[source.GetExternalID()]
	if !ok {
		name, err := user_model.GetUserNameByID(g.ctx, source.GetExternalID())
		if err != nil {
			return 0, err
		}
		// let's not reuse an ID when the user was deleted or has a different user name
		if name != source.GetExternalName() {
			userid = 0
		} else {
			userid = source.GetExternalID()
		}
		g.userMap[source.GetExternalID()] = userid
	}
	return userid, nil
}

func (g *GiteaLocalUploader) remapExternalUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) (userid int64, err error) {
	userid, ok := g.userMap[source.GetExternalID()]
	if !ok {
		userid, err = user_model.GetUserIDByExternalUserID(g.gitServiceType.Name(), fmt.Sprintf("%d", source.GetExternalID()))
		if err != nil {
			log.Error("GetUserIDByExternalUserID: %v", err)
			return 0, err
		}
		g.userMap[source.GetExternalID()] = userid
	}
	return userid, nil
}