
Commit 1b5e3de

Merge pull request #35219 from github/repo-sync
Repo sync
2 parents aec2b02 + 1a99ce6 commit 1b5e3de

122 files changed

Lines changed: 4477 additions & 2463 deletions


.env.example

Lines changed: 23 additions & 0 deletions
@@ -0,0 +1,23 @@
+# This file is a template for what your untracked .env file might look like for local development.
+# Please copy this to a new .env file and fill in the values as needed.
+
+# Requires a running local Elasticsearch service. Can be started via Docker, see https://github.com/github/docs-engineering/blob/main/docs/elasticsearch/elasticsearch-locally.md
+# When this value is unset searches will be proxied to the production Elasticsearch endpoint
+ELASTICSEARCH_URL=http://localhost:9200
+
+# Set for sending events in local development. See https://github.com/github/docs-engineering/blob/main/docs/analytics/hydro-mock.md
+HYDRO_ENDPOINT=
+HYDRO_SECRET=
+
+# Localization variables
+# See https://github.com/github/docs-internal/tree/main/src/languages#working-with-translated-content-locally
+ENABLED_LANGUAGES=
+TRANSLATIONS_ROOT=
+
+# For running the src/search/scripts/scrape script
+# You may want a lower value depending on your CPU
+BUILD_RECORDS_MAX_CONCURRENT=100
+BUILD_RECORDS_MIN_TIME=
+
+# Set to true to enable the /fastly-cache-test route for debugging Fastly headers
+ENABLE_FASTLY_TESTING=
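For local development, the new template can be used roughly as follows. This is a sketch, not part of the commit: the Elasticsearch image tag and the security flag are assumptions; see the linked docs-engineering guide for the supported setup.

```shell
# Copy the template to an untracked .env file and fill in values as needed.
cp .env.example .env

# Start a single-node Elasticsearch to back ELASTICSEARCH_URL=http://localhost:9200.
# (Image tag and xpack flag are assumptions, not from this commit.)
docker run -d --name docs-es -p 9200:9200 \
  -e discovery.type=single-node \
  -e xpack.security.enabled=false \
  docker.elastic.co/elasticsearch/elasticsearch:8.12.2
```

With `ELASTICSEARCH_URL` left unset, searches are proxied to the production endpoint instead, per the comment in the template.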

.github/workflows/index-autocomplete-elasticsearch.yml renamed to .github/workflows/index-autocomplete-search.yml

Lines changed: 11 additions & 6 deletions
@@ -1,7 +1,7 @@
-name: Index autocomplete Elasticsearch
+name: Index autocomplete search in Elasticsearch

-# **What it does**: Indexes autocomplete data into Elasticsearch.
-# **Why we have it**: So we can power the API for autocomplete.
+# **What it does**: Indexes autocomplete data (general and AI search) into Elasticsearch.
+# **Why we have it**: So we can power the APIs for autocomplete.
 # **Who does it impact**: docs-engineering

 on:
@@ -10,7 +10,7 @@ on:
     - cron: '20 16 * * *' # Run every day at 16:20 UTC / 8:20 PST
   pull_request:
     paths:
-      - .github/workflows/index-autocomplete-elasticsearch.yml
+      - .github/workflows/index-autocomplete-search.yml
       - 'src/search/scripts/index/**'
       - 'package*.json'

@@ -40,10 +40,15 @@ jobs:
         if: ${{ github.event_name == 'pull_request' }}
         run: curl --fail --retry-connrefused --retry 5 -I http://localhost:9200

-      - name: Run indexing
+      - name: Run general auto-complete indexing
         env:
           ELASTICSEARCH_URL: ${{ github.event_name == 'pull_request' && 'http://localhost:9200' || secrets.ELASTICSEARCH_URL }}
-        run: npm run index -- autocomplete docs-internal-data
+        run: npm run index-general-autocomplete -- docs-internal-data
+
+      - name: Run AI search auto-complete indexing
+        env:
+          ELASTICSEARCH_URL: ${{ github.event_name == 'pull_request' && 'http://localhost:9200' || secrets.ELASTICSEARCH_URL }}
+        run: npm run index-ai-search-autocomplete -- docs-internal-data

       - uses: ./.github/actions/slack-alert
         if: ${{ failure() && github.event_name == 'schedule' }}
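The renamed workflow splits one indexing step into two. Run by hand, the sequence the diff describes looks roughly like this — a sketch assuming a local Elasticsearch on port 9200 and a sibling clone of `docs-internal-data`, as the PR path of the workflow assumes:

```shell
# Point both indexing scripts at the local cluster, as the pull_request path does.
export ELASTICSEARCH_URL=http://localhost:9200

# Verify the cluster is reachable before indexing (same check as the workflow).
curl --fail --retry-connrefused --retry 5 -I "$ELASTICSEARCH_URL"

# Index general autocomplete, then AI search autocomplete.
npm run index-general-autocomplete -- docs-internal-data
npm run index-ai-search-autocomplete -- docs-internal-data
```

On the scheduled path, `ELASTICSEARCH_URL` comes from `secrets.ELASTICSEARCH_URL` instead of localhost.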

.github/workflows/sync-search-pr.yml renamed to .github/workflows/index-general-search-pr.yml

Lines changed: 9 additions & 14 deletions
@@ -1,6 +1,6 @@
-name: Sync search - PR
+name: Index general search in Elasticsearch on PR

-# **What it does**: This does what `sync-sarch-elasticsearch.yml` does but
+# **What it does**: This does what `index-general-search-elasticsearch.yml` does but
 # with a localhost Elasticsearch and only for English.
 # **Why we have it**: To test that the script works and the popular pages json is valid.
 # **Who does it impact**: Docs engineering
@@ -11,8 +11,8 @@ on:
     paths:
       - 'src/search/**'
       - 'package*.json'
-      # Ultimately, for debugging this workflow itself
-      - .github/workflows/sync-search-pr.yml
+      # For debugging this workflow
+      - .github/workflows/index-general-search-pr.yml
       # Make sure we run this if the composite action changes
       - .github/actions/setup-elasticsearch/action.yml

@@ -25,9 +25,6 @@ concurrency:
   cancel-in-progress: true

 env:
-  # Yes, it's hardcoded but it makes all the steps look exactly the same
-  # as they do in `sync-search-elasticsearch.yml` where it uses
-  # that `${{ env.ELASTICSEARCH_URL }}`
   ELASTICSEARCH_URL: http://localhost:9200
   # Since we'll run in NDOE_ENV=production, we need to be explicit that
   # we don't want Hydro configured.
@@ -63,7 +60,7 @@ jobs:
         env:
           ENABLE_DEV_LOGGING: false
         run: |
-          npm run sync-search-server > /tmp/stdout.log 2> /tmp/stderr.log &
+          npm run general-search-scrape-server > /tmp/stdout.log 2> /tmp/stderr.log &

           # first sleep to give it a chance to start
           sleep 6
@@ -88,15 +85,13 @@ jobs:
           # let's just accept an empty string instead.
           THROW_ON_EMPTY: false

-          # The sync-search-index recognizes this env var if you don't
-          # use the `--docs-internal-data <PATH>` option.
           DOCS_INTERNAL_DATA: docs-internal-data

         run: |
           mkdir /tmp/records
-          npm run sync-search-indices -- /tmp/records \
+          npm run general-search-scrape -- /tmp/records \
             --language en \
-            --version dotcom
+            --version fpt

           ls -lh /tmp/records
@@ -106,9 +101,9 @@ jobs:

       - name: Index into Elasticsearch
         run: |
-          npm run index-elasticsearch -- /tmp/records \
+          npm run index-general-search -- /tmp/records \
             --language en \
-            --version dotcom
+            --version fpt

       - name: Check created indexes and aliases
         run: |
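Taken together, the renamed steps form a scrape-then-index pipeline. Reproduced locally it looks roughly like this — a sketch based on the workflow above, assuming a local Elasticsearch is already running and the site builds successfully:

```shell
# Serve the built site for the scraper (backgrounded, logs captured),
# mirroring the workflow's server step.
npm run general-search-scrape-server > /tmp/stdout.log 2> /tmp/stderr.log &
sleep 6  # first sleep to give it a chance to start

# Scrape English fpt records to disk, then index them into Elasticsearch.
mkdir -p /tmp/records
npm run general-search-scrape -- /tmp/records --language en --version fpt
npm run index-general-search -- /tmp/records --language en --version fpt
```

Note the version flag changed from `dotcom` to `fpt` in this commit; the scrape and index invocations must use the same version or the index script will seek other versions from disk.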

.github/workflows/sync-search-elasticsearch.yml renamed to .github/workflows/index-general-search.yml

Lines changed: 5 additions & 7 deletions
@@ -1,4 +1,4 @@
-name: Sync search Elasticsearch
+name: Index general search in Elasticsearch

 # **What it does**: It scrapes the whole site and dumps the records in a
 # temp directory. Then it indexes that into Elasticsearch.
@@ -140,7 +140,7 @@ jobs:
         env:
           ENABLE_DEV_LOGGING: false
         run: |
-          npm run sync-search-server > /tmp/stdout.log 2> /tmp/stderr.log &
+          npm run general-search-scrape-server > /tmp/stdout.log 2> /tmp/stderr.log &

           # first sleep to give it a chance to start
           sleep 6
@@ -169,13 +169,11 @@ jobs:
           # the same as not set within the script.
           VERSION: ${{ inputs.version }}

-          # The sync-search-index recognizes this env var if you don't
-          # use the `--docs-internal-data <PATH>` option.
           DOCS_INTERNAL_DATA: docs-internal-data

         run: |
           mkdir /tmp/records
-          npm run sync-search-indices -- /tmp/records \
+          npm run general-search-scrape -- /tmp/records \
             --language ${{ matrix.language }}

           ls -lh /tmp/records
@@ -186,12 +184,12 @@ jobs:

       - name: Index into Elasticsearch
         env:
-          # Must match what we used when scraping (npm run sync-search-indices)
+          # Must match what we used when scraping (npm run general-search-scrape)
           # otherwise the script will seek other versions from disk that might
           # not exist.
           VERSION: ${{ inputs.version }}
         run: |
-          npm run index-elasticsearch -- /tmp/records \
+          npm run index-general-search -- /tmp/records \
             --language ${{ matrix.language }} \
             --stagger-seconds 5 \
             --retries 5

.gitignore

Lines changed: 6 additions & 0 deletions
@@ -51,3 +51,9 @@ assets/images/help/writing/unordered-list-rendered (1).png

 # Used by precompute-pageinfo
 .pageinfo-cache.json.br
+
+# Cloned and used for indexing Elasticsearch data
+docs-internal-data/
+
+# For intermediate data (like scraping for Elasticsearch indexing)
+tmp/

content/admin/upgrading-your-instance/troubleshooting-upgrades/known-issues-with-upgrades-to-your-instance.md

Lines changed: 27 additions & 0 deletions
@@ -212,3 +212,30 @@ If your appliance averages more than 70% CPU utilization, {% data variables.prod

 As part of upgrading GitHub Enterprise Server to version 3.13 or later, the Elasticsearch service will be upgraded. {% data variables.product.company_short %} strongly recommends following the guidance in "[AUTOTITLE](/admin/upgrading-your-instance/performing-an-upgrade/preparing-for-the-elasticsearch-upgrade)."
 {% endif %}
+
+{% ifversion ghes > 3.12 and ghes < 3.15 %}
+
+## Undecryptable records
+
+If you are upgrading from {% data variables.product.prodname_ghe_server %} 3.11 or 3.12 to 3.13, or from 3.12 to 3.14, you may run into an issue with undecryptable records due to missing required keys for decryption. The only solution is to delete the undecryptable records. The records impacted by this issue are 2FA records, which means you might need to ask users to re-enable two-factor authentication (2FA).
+
+### Before upgrading
+
+If you are upgrading from {% data variables.product.prodname_ghe_server %} 3.11 or 3.12 to 3.13, or from 3.12 to 3.14, you can run the encryption diagnostics script to identify the undecryptable records ahead of time. This gives you the opportunity to understand the impact and plan for it.
+
+1. Download the [encryption diagnostics script](https://gh.io/ghes-encryption-diagnostics). You can use a command like `curl -L -O https://gh.io/ghes-encryption-diagnostics` to download the script.
+1. Save the script to the `/data/user/common` directory on the appliance.
+1. Follow the instructions at the top of the script and execute it on the appliance. If there are any undecryptable records, they are logged in `/tmp/column_encryption_records_to_be_deleted.log`. Any record logged there means the system could not find the keys for it and therefore could not decrypt its data.
+
+Please note that these records will be deleted as part of the process. The script warns you about the users who will need to re-enroll into 2FA after the upgrade. The impacted users' handles are logged in `/tmp/column_encryption_users_to_have_2fa_disabled.log`.
+
+If the script runs into unexpected issues, you will be prompted to [contact {% data variables.contact.github_support %}](/support/contacting-github-support). Errors related to these issues are logged in `/tmp/column_encryption_unexpected_errors.log`. If you are in a dire situation and are unable to have users re-enroll into 2FA, [contact {% data variables.contact.github_support %}](/support/contacting-github-support) for help.
+
+### During the upgrade
+
+If you did not have the opportunity to run the encryption diagnostics script ahead of time, there are mechanisms in the product to help you. The pre-flight checks during the upgrade process detect undecryptable records and log them in `/tmp/column_encryption_records_to_be_deleted.log`. The sequence warns you of the users who will need to re-enable 2FA after the upgrade. The impacted users' records are logged in `/tmp/column_encryption_users_to_have_2fa_disabled.log`.
+
+If undecryptable records are detected, you will be prompted to choose whether to proceed with the upgrade. If you proceed, the upgrade process deletes the undecryptable records. Otherwise, the upgrade process exits.
+
+If you have any questions during the upgrade, you can reach out to {% data variables.contact.github_support %}. Once you have had the time and opportunity to understand the impact, you can retrigger the upgrade.
+{% endif %}
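The pre-upgrade steps in this added section can be sketched as a short shell session. The commands and log paths are taken from the docs above; running the script itself requires the appliance, so this is an outline rather than something to run elsewhere:

```shell
# Download the encryption diagnostics script and save it where the docs expect it.
curl -L -O https://gh.io/ghes-encryption-diagnostics
mv ghes-encryption-diagnostics /data/user/common/

# After following the instructions at the top of the script and executing it
# on the appliance, review what would be deleted and who must re-enroll in 2FA:
wc -l /tmp/column_encryption_records_to_be_deleted.log
cat /tmp/column_encryption_users_to_have_2fa_disabled.log
# Unexpected errors, if any, end up here:
cat /tmp/column_encryption_unexpected_errors.log
```

If any of the logs are non-empty, plan the 2FA re-enrollment before triggering the upgrade.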

content/billing/managing-the-plan-for-your-github-account/connecting-an-azure-subscription.md

Lines changed: 6 additions & 0 deletions
@@ -80,6 +80,12 @@ For example, you link your Azure subscription to your organization {% ifversion

 * You must know your Azure subscription ID. See [Get subscription and tenant IDs in the Azure portal](https://learn.microsoft.com/en-us/azure/azure-portal/get-subscription-tenant-id) in the Microsoft Docs or [contact Azure support](https://azure.microsoft.com/support/).

+## Video demonstration of connecting a subscription
+
+To connect an Azure subscription, you'll need appropriate access permissions on both {% data variables.product.product_name %} and the Azure billing portal. This may require coordination between two different people.
+
+To see a demo of the process from beginning to end, see [Billing GitHub consumption through an Azure subscription](https://www.youtube.com/watch?v=Y-f7JKJ4_8Y) on {% data variables.product.company_short %}'s YouTube channel. This video demonstrates the process for an enterprise account. If you're connecting a subscription to an organization account, see "[Connecting your Azure subscription to your organization account](/free-pro-team@latest/billing/managing-the-plan-for-your-github-account/connecting-an-azure-subscription#connecting-your-azure-subscription-to-your-organization-account)."
+
 {% ifversion fpt %}

 ## Connecting your Azure subscription to your organization account

content/code-security/codeql-cli/codeql-cli-manual/generate-query-help.md

Lines changed: 1 addition & 1 deletion
@@ -35,7 +35,7 @@ Generate end-user query help from .qhelp files.

 ### Primary Options

-#### `<qhelp|mdhelp|query|dir|suite>...`
+#### `<qhelpquerysuite>...`

 \[Mandatory] Query help files to render. Each argument is one of:

content/video-transcripts/transcript-billing-github-consumption-through-an-azure-subscription.md

Lines changed: 4 additions & 2 deletions
@@ -3,7 +3,7 @@ title: Transcript - "Billing GitHub consumption through an Azure subscription"
 intro: Audio and visual transcript.
 shortTitle: Billing through Azure
 allowTitleToDifferFromFilename: true
-product_video: 'https://www.youtube.com/watch?v=DAiIhJKCt8s'
+product_video: 'https://www.youtube.com/watch?v=Y-f7JKJ4_8Y'
 topics:
   - Transcripts
 versions:
@@ -27,7 +27,9 @@ And finally, if a Microsoft customer has an Azure discount, it will automaticall

 If a Microsoft customer also has a Microsoft Azure Consumption Commitment, or MACC, all future GitHub consumption will decrement their MACC as well.

-So what GitHub products are eligible for Azure billing? Any GitHub consumption products are eligible today, meaning products that customers pay for based on actual usage, including Copilot for Business, GitHub-hosted actions, larger hosted runners, GitHub Packages and storage, and GitHub Codespaces. Please note that GitHub Enterprise and GitHub Advanced Security are currently not able to be billed through Azure, but are instead invoiced on an annual basis.
+So what GitHub products are eligible for Azure billing? Any GitHub consumption products are eligible today, meaning products that customers pay for based on actual usage, including things like GitHub Copilot, GitHub-hosted actions, larger hosted runners, GitHub Packages and storage, and GitHub Codespaces.
+
+Historically, GitHub Enterprise and Advanced Security were only available through an annual license. However, as of August 1, 2024, they are now also available for metered billing through Azure, for additional flexibility and pay-as-you-go pricing. For existing licensed customers, be sure to connect with your GitHub seller to learn more, as certain restrictions may apply.

 [A table shows eligibility for Azure billing and MACCs for the products mentioned. In the table, all products eligible for Azure billing are also eligible for MACCs.]
data/release-notes/enterprise-server/3-10/17.yml

Lines changed: 2 additions & 0 deletions
@@ -5,6 +5,8 @@ sections:
       **MEDIUM:** An attacker could steal sensitive information by exploiting a Cross-Site Scripting vulnerability in the repository transfer feature. This exploitation would require social engineering. GitHub has requested CVE ID [CVE-2024-8770](https://www.cve.org/cverecord?id=CVE-2024-8770) for this vulnerability, which was reported via the [GitHub Bug Bounty program](https://bounty.github.com/).
     - |
       **MEDIUM:** An attacker could push a commit with changes to a workflow using a PAT or OAuth app that lacks the appropriate `workflow` scope by pushing a triple-nested tag pointing at the associated commit. GitHub has requested CVE ID [CVE-2024-8263](https://www.cve.org/cverecord?id=CVE-2024-8263) for this vulnerability, which was reported via the [GitHub Bug Bounty program](https://bounty.github.com/).
+    - |
+      **HIGH:** A GitHub App installed in organizations could upgrade some permissions from read to write access without approval from an organization administrator. An attacker would require an account with administrator access to install a malicious GitHub App. GitHub has requested [CVE ID CVE-2024-8810](https://www.cve.org/cverecord?id=CVE-2024-8810) for this vulnerability, which was reported via the [GitHub Bug Bounty Program](https://bounty.github.com/). [Updated: 2024-11-07]
   bugs:
     - |
       For instances deployed on AWS with IMDSv2 enforced, fallback to private IPs was not successful.
