
Azure Hands-on Lab - Migrating Data with Azure Data Factory


This lab uses Azure Cosmos DB. Its key points are:

1. Using the cosmicworks tool to generate the lab data.

2. Understanding the relationship between the Cosmos DB account name, the database id, and the container id.

3. Creating the ADF connection and task that migrates data from the products container of the cosmicworks database to the flatproducts container of the same database.

The lab comes from: Exercise - Migrate existing data using Azure Data Factory - Training | Microsoft Learn

Migrate existing data using Azure Data Factory

In Azure Data Factory, Azure Cosmos DB is supported as a source of data ingest and as a target (sink) of data output.

In this lab, we will populate Azure Cosmos DB using a helpful command-line utility and then use Azure Data Factory to move a subset of data from one container to another.

Create and seed your Azure Cosmos DB SQL API account

You will use a command-line utility that creates a cosmicworks database and a products container at 4,000 request units per second (RU/s). Once created, you will adjust the throughput down to 400 RU/s.

To accompany the products container, you will create a flatproducts container manually that will be the target of the ETL transformation and load operation at the end of this lab.
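For orientation, the two document shapes involved can be sketched as follows. This is a minimal illustration in Python; the field values are hypothetical (the real items are generated by cosmicworks later in the lab), but the renaming of categoryName to category is the flattening this lab performs.

```python
# Hypothetical product document, similar in shape to what cosmicworks
# seeds into the products container (values are illustrative only).
source_item = {
    "id": "027d0b9a-f9d9-4c96-8213-c8546c4aae71",
    "name": "HL Headset",
    "categoryName": "Accessories, Headsets",
    "price": 102.29,
}

# The ETL step later in the lab produces a reduced copy in the
# flatproducts container, with categoryName renamed to category:
target_item = {
    "name": source_item["name"],
    "category": source_item["categoryName"],  # renamed field
    "price": source_item["price"],
}

print(target_item)
```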

  1. In a new web browser window or tab, navigate to the Azure portal (portal.azure.com).

  2. Sign into the portal using the Microsoft credentials associated with your subscription.

  3. Select + Create a resource, search for Cosmos DB, and then create a new Azure Cosmos DB SQL API account resource with the following settings, leaving all remaining settings at their default values:

    Subscription: Your existing Azure subscription
    Resource group: Select an existing or create a new resource group
    Account Name: Enter a globally unique name
    Location: Choose any available region
    Capacity mode: Provisioned throughput
    Apply Free Tier Discount: Do Not Apply
    Limit the total amount of throughput that can be provisioned on this account: Unchecked

    Note: Your lab environments may have restrictions preventing you from creating a new resource group. If that is the case, use the existing pre-created resource group.

  4. Wait for the deployment task to complete before continuing with this task.

  5. Go to the newly created Azure Cosmos DB account resource and navigate to the Keys pane.

  6. This pane contains the connection details and credentials necessary to connect to the account from the SDK. Specifically:

    1. Record the value of the URI field. You will use this endpoint value later in this exercise.

    2. Record the value of the PRIMARY KEY field. You will use this key value later in this exercise.

  7. Close your web browser window or tab.

  8. Start Visual Studio Code.

    Note: If you are not already familiar with the Visual Studio Code interface, review the Get Started guide for Visual Studio Code.

  9. In Visual Studio Code, open the Terminal menu and then select New Terminal to open a new terminal instance.

  10. Install the cosmicworks command-line tool for global use on your machine.

    dotnet tool install --global cosmicworks

    Note: This command may take a couple of minutes to complete. It will output the warning message (Tool 'cosmicworks' is already installed) if you have already installed the latest version of this tool in the past.

  11. Run cosmicworks to seed your Azure Cosmos DB account with the following command-line options:

    --endpoint: The endpoint value you copied earlier in this lab
    --key: The key value you copied earlier in this lab
    --datasets: product

    cosmicworks --endpoint <cosmos-endpoint> --key <cosmos-key> --datasets product

    Note: For example, if your endpoint is https://dp420.documents.azure.com:443/ and your key is fDR2ci9QgkdkvERTQ==, then the command would be: cosmicworks --endpoint https://dp420.documents.azure.com:443/ --key fDR2ci9QgkdkvERTQ== --datasets product

  12. Wait for the cosmicworks command to finish populating the account with a database, container, and items.

  13. Close the integrated terminal.

  14. Close?Visual Studio Code.

  15. In a new web browser window or tab, navigate to the Azure portal (portal.azure.com).

  16. Sign into the portal using the Microsoft credentials associated with your subscription.

  17. Select Resource groups, then select the resource group you created or viewed earlier in this lab, and then select the Azure Cosmos DB account resource you created in this lab.

  18. Within the Azure Cosmos DB account resource, navigate to the Data Explorer pane.

  19. In the Data Explorer, expand the cosmicworks database node, expand the products container node, and then select Items.

  20. Observe and select the various JSON items in the products container. These are the items created by the command-line tool used in previous steps.

  21. Select the Scale & Settings node. In the Scale & Settings tab, select Manual, update the required throughput setting from 4000 RU/s to 400 RU/s, and then Save your changes.

  22. In the Data Explorer pane, select New Container.

  23. In the New Container popup, enter the following values for each setting, and then select OK:

    Database id: Use existing | cosmicworks
    Container id: flatproducts
    Partition key: /category
    Container throughput (autoscale): Manual
    RU/s: 400

  24. Back in the Data Explorer pane, expand the cosmicworks database node and then observe the flatproducts container node within the hierarchy.

  25. Return to the Home page of the Azure portal.

Create Azure Data Factory resource

Now that the Azure Cosmos DB SQL API resources are in place, you will create an Azure Data Factory resource and configure all of the necessary components and connections to perform a one-time data movement from one SQL API container to another to extract data, transform it, and load it to another SQL API container.

  1. Select + Create a resource, search for Data Factory, and then create a new Azure Data Factory resource with the following settings, leaving all remaining settings at their default values:

    Subscription: Your existing Azure subscription
    Resource group: Select an existing or create a new resource group
    Name: Enter a globally unique name
    Region: Choose any available region
    Version: V2
    Git configuration: Configure Git later

    Note: Your lab environments may have restrictions preventing you from creating a new resource group. If that is the case, use the existing pre-created resource group.

  2. Wait for the deployment task to complete before continuing with this task.

  3. Go to the newly created Azure Data Factory resource and select Open Azure Data Factory Studio.

    Note: Alternatively, you can navigate to adf.azure.com/home, select your newly created Data Factory resource, and then select the home icon.

  4. From the home screen, select the Ingest option to begin the quick wizard to perform a one-time copy-data-at-scale operation and move to the Properties step of the wizard.

  5. Starting with the Properties step of the wizard, in the Task type section, select Built-in copy task.

  6. In the Task cadence or task schedule section, select Run once now and then select Next to move to the Source step of the wizard.

  7. In the Source step of the wizard, in the Source type list, select Azure Cosmos DB (SQL API).

  8. In the Connection section, select + New connection.

  9. In the New connection (Azure Cosmos DB (SQL API)) popup, configure the new connection with the following values, and then select Create:

    Name: CosmosSqlConn
    Connect via integration runtime: AutoResolveIntegrationRuntime
    Authentication method: Account key | Connection string
    Account selection method: From Azure subscription
    Azure subscription: Your existing Azure subscription
    Azure Cosmos DB account name: The Azure Cosmos DB account name you chose earlier in this lab
    Database name: cosmicworks

  10. Back in the Source data store section, within the Source tables section, select Use query.

  11. In the Table name list, select products.

  12. In the Query editor, delete the existing content and enter the following query:

    SELECT 
        p.name, 
        p.categoryName as category, 
        p.price 
    FROM 
        products p
  13. Select Preview data to test the query's validity. Select Next to move to the Target step of the wizard.
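The copy activity applies the query above to every item in the products container. As a rough local sketch of that projection (pure Python, with made-up sample items; only the fields the query touches are shown):

```python
# Made-up sample of items in the products container.
products = [
    {"id": "1", "name": "HL Headset", "categoryName": "Accessories, Headsets", "price": 102.29},
    {"id": "2", "name": "LL Road Frame", "categoryName": "Components, Road Frames", "price": 337.22},
]

def flatten(item: dict) -> dict:
    """Mirror of: SELECT p.name, p.categoryName AS category, p.price."""
    return {
        "name": item["name"],
        "category": item["categoryName"],  # the AS alias renames this field
        "price": item["price"],
    }

# What the copy activity would write into flatproducts for these items.
flatproducts = [flatten(p) for p in products]
print(flatproducts[0])
```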

  14. In the Target step of the wizard, in the Target type list, select Azure Cosmos DB (SQL API).

  15. In the Connection list, select CosmosSqlConn.

  16. In the Target list, select flatproducts and then select Next to move to the Settings step of the wizard.

  17. In the Settings step of the wizard, in the Task name field, enter FlattenAndMoveData.

  18. Leave all remaining fields at their default blank values and then select Next to move to the final step of the wizard.

  19. Review the Summary of the steps you have selected in the wizard and then select Next.

  20. Observe the various steps in the deployment. When the deployment has finished, select Finish.

  21. Close your web browser window or tab.

  22. In a new web browser window or tab, navigate to the Azure portal (portal.azure.com).

  23. Sign into the portal using the Microsoft credentials associated with your subscription.

  24. Select Resource groups, then select the resource group you created or viewed earlier in this lab, and then select the Azure Cosmos DB account resource you created in this lab.

  25. Within the Azure Cosmos DB account resource, navigate to the Data Explorer pane.

  26. In the Data Explorer, expand the cosmicworks database node, select the flatproducts container node, and then select New SQL Query.

  27. Delete the contents of the editor area.

  28. Create a new SQL query that will return all documents where the name equals HL Headset:

    SELECT 
        p.name, 
        p.category, 
        p.price 
    FROM
        products p
    WHERE
        p.name = 'HL Headset'
  29. Select Execute Query.
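In terms of the hypothetical flattened items sketched earlier (the actual seeded values may differ), the WHERE clause simply filters the flatproducts items by name:

```python
# Hypothetical flattened items in the flatproducts container.
flatproducts = [
    {"name": "HL Headset", "category": "Accessories, Headsets", "price": 102.29},
    {"name": "LL Road Frame", "category": "Components, Road Frames", "price": 337.22},
]

# Equivalent of: WHERE p.name = 'HL Headset'
matches = [p for p in flatproducts if p["name"] == "HL Headset"]
print(matches)
```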

  30. Observe the results of the query.
