[AWS CDK] Building the zero-ETL integration between Amazon DynamoDB and Amazon OpenSearch Serverless with AWS CDK
- Introduction
- Notes
- Reference links
- Environment
- Overview
- 0. Preparation
- 1. Create the CDK project
- 2. S3
- 3. DynamoDB
- 4. IAM Role
- 5. Amazon OpenSearch Service
- 6. IAM Policy
- 7. Pipeline
- Deploy and test
- Cleanup
- Summary
- Full CDK definition
Introduction
This is an example of building the zero-ETL integration between DynamoDB and OpenSearch Service, released at the end of last year, with AWS CDK.
This article is the CDK version of the CLI procedure described here.
It is almost the same as the CLI version, but with CDK you need to make dependencies between resources explicit with the addDependency method when they exist. This is especially important when you attach or bind a policy based on the ID of a resource you just created. Also, the pipeline's processing definition (YAML) currently has to be passed as a string; the way it is embedded differs between the CLI and TypeScript, so watch out for that part as well.
* I recommend trying the CLI version of the procedure first. If you understand how the resources relate to each other in the CLI version, the dependencies here should not be hard to follow.
Notes
These resources incur charges. Delete them once you are done testing. The author cannot take any responsibility for problems caused by following the steps below; proceed at your own risk.
Reference links
The official AWS announcement:
Amazon DynamoDB zero-ETL integration with Amazon OpenSearch Service is now available | Amazon Web Services Blog
The official tutorial (GUI-based). In this article, we build the collection (serverless) version of the resources below using AWS CDK.
Tutorial: Ingesting data into a domain using Amazon OpenSearch Ingestion - Amazon OpenSearch Service
DynamoDB zero-ETL integration with Amazon OpenSearch Service - Amazon DynamoDB
Environment
| | Version |
|---|---|
| MacOS Sonoma | 14.4.1 |
| AWS CLI | 2.15.34 |
| AWS CDK | 2.145.0 |
| awscurl | 0.33 |
Overview
We will create the following resources:
- Amazon S3
- An Amazon DynamoDB table
- An Amazon OpenSearch Service collection
  - Three kinds of policies
    - Data access policies
    - Encryption policies
    - Network policies
- Pipeline
- IAM Role
- IAM Policy: access permissions to OpenSearch and DynamoDB
The pipeline does the following:
- Watches DynamoDB (detects that data has been inserted)
- Inserts/deletes/updates the data in OpenSearch
- Uploads backups and the like to S3
Also, Amazon OpenSearch Service has resource-based policies. Allowing access to Amazon OpenSearch Service on the IAM role alone is not enough; the IAM role must also be granted access on the Amazon OpenSearch Service side, in its Data access policies (described later).
0. Preparation
Make sure AWS CDK is ready to use, and give the user who runs CDK the permissions they need.
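For reference, a minimal setup sketch (these are the standard CDK install and bootstrap commands; the account ID and region below are placeholders for your own environment):

npm install -g aws-cdk
cdk bootstrap aws://123456789012/ap-northeast-1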
1. Create the CDK project
mkdir zero-etl-dynamodb-aoss
cd $_
cdk init app --language typescript
We will write the code in lib/zero-etl-dynamodb-aoss-stack.ts.
The import statements and so on are included in the full code at the end, so refer to that.
2. S3
So that the bucket is deleted when you run cdk destroy, we set removalPolicy (destroy the bucket) and autoDeleteObjects (destroy its contents).
const s3bucket = new Bucket(this, 'S3Bucket', {
  bucketName: 'ingestion-dynamodb',
  removalPolicy: cdk.RemovalPolicy.DESTROY,
  autoDeleteObjects: true,
});
3. DynamoDB
Data inserted into this DynamoDB table is what the pipeline will automatically ingest into OpenSearch.
We use a string attribute name as the partitionKey and a numeric age as the sortKey. Together they are unique; the OpenSearch document id is expected to be built from this composite primary key (name and age).
The pointInTimeRecovery and stream settings are required (strictly speaking, the later experiments work with just the latter). These settings are what allow the pipeline to detect the table's contents and ingest them into OpenSearch.
We also set removalPolicy so that the DynamoDB table is destroyed when the CDK stack is deleted.
In addition, readCapacity and writeCapacity are set to the minimum of 1 (this is optional).
const table = new Table(this, 'DynamoDBTable', {
  tableName: 'ingestion-table',
  partitionKey: {name: 'name', type: AttributeType.STRING},
  sortKey: {name: 'age', type: AttributeType.NUMBER},
  pointInTimeRecovery: true,
  stream: StreamViewType.NEW_IMAGE,
  readCapacity: 1,
  writeCapacity: 1,
  removalPolicy: cdk.RemovalPolicy.DESTROY
});
4. IAM Role
We create the pipeline's role first, because we want to grant access in the OpenSearch Data Access Policy by specifying this role's ARN.
Since this is the pipeline's role, the trust policy principal is set to osis-pipelines.amazonaws.com.
const pipelineRole = new Role(this, 'pipelineRole', {
  roleName: 'PipelineRole',
  assumedBy: new ServicePrincipal('osis-pipelines.amazonaws.com')
});
5. Amazon OpenSearch Service
We create the three kinds of policies and the collection itself. Also, as of June 18, 2024, there do not seem to be L2 constructs for these, so the L1 classes (the ones prefixed with Cfn) are used.
Data Access Policy
This is a resource policy. In addition to the role created above, it is a good idea to add your CLI user as a principal (so you can query OpenSearch from the CLI when testing).
If you do not know your CLI user name, run aws sts get-caller-identity from the CLI.
const cliUser = User.fromUserName(this, 'existingUser', 'local-cli-user'); const dataAccessPolicy = new CfnAccessPolicy(this, "OpenSearchAccessPolicy", { name: "cdk-access-policy", type: "data", policy: JSON.stringify([ { "Rules": [ { "Resource": [`index/${collection.name}/*`], "Permission": [ "aoss:CreateIndex", "aoss:UpdateIndex", "aoss:DescribeIndex", "aoss:ReadDocument", "aoss:WriteDocument" ], "ResourceType": "index" } ], "Principal": [ cliUser.userArn, pipelineRole.roleArn] } ]), });
Encryption Policy
To configure data encryption for the collection, we define an Encryption Policy. The Resource entry uses the collection.name defined earlier.
const encryptionPolicy = new CfnSecurityPolicy(this, 'OpenSearchEncryptionPolicy', { name: 'encryption-policy', type: 'encryption', policy: JSON.stringify({ "Rules": [ { "ResourceType": "collection", "Resource": [`collection/${collection.name}`] } ], "AWSOwnedKey": true }), });
The encryption policy must exist before the collection is created, so let's make the dependency explicit with the addDependency method.
// NOTE: the Collection depends on the encryptionPolicy
collection.addDependency(encryptionPolicy)
Network Policy
To make experimenting easier, public access is allowed.
const networkPolicy = new CfnSecurityPolicy(this, 'OpenSearchNetworkPolicy', { name: "network-policy", type: "network", policy: JSON.stringify([ { "Rules": [ { "ResourceType": "dashboard", "Resource": [ `collection/${collection.name}`] }, { "ResourceType": "collection", "Resource": [ `collection/${collection.name}`] }, ], "AllowFromPublic": true, } ]) });
6. IAM Policy
We create the policy to attach to the pipeline's role. (It needs to reference the collection created above, which is why we waited until now to create it.)
Following the official tutorial, the permissions are set somewhat strictly; for an experiment, looser permissions would probably be fine.
const pipelinePolicy = new Policy(this, 'pipelinePolicy', { policyName: 'pipelinePolicy', statements: [ new PolicyStatement({ effect: Effect.ALLOW, actions: [ "dynamodb:DescribeTable", "dynamodb:DescribeContinuousBackups", "dynamodb:ExportTableToPointInTime" ], resources: [`${table.tableArn}`] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "dynamodb:DescribeExport" ], resources: [`${table.tableArn}/export/*`] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "dynamodb:DescribeStream", "dynamodb:GetRecords", "dynamodb:GetShardIterator" ], resources: [`${table.tableArn}/stream/*`] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "s3:GetObject", "s3:AbortMultipartUpload", "s3:PutObject", "s3:PutObjectAcl" ], resources: [ `${s3bucket.bucketArn}/*` ] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "aoss:BatchGetCollection", "aoss:APIAccessAll" ], resources: [ `${collection.attrArn}` ] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "aoss:CreateSecurityPolicy", "aoss:GetSecurityPolicy", "aoss:UpdateSecurityPolicy" ], resources: ['*'], conditions: { StringEquals: { "aoss:collection": collection.name } } }), ] });
(Important) Attach the policy to the role:
pipelinePolicy.attachToRole(pipelineRole)
7. Pipeline
Define the pipeline.
Unfortunately, the ETL processing definition (pipelineConfigurationBody) can apparently only be defined as a string. A shame, given that we are in TypeScript... let's wait for an L2 construct.
If you want a template string to start from, fetch it with the command below; details are in my CLI version of this article.
aws osis get-pipeline-blueprint --blueprint-name AWS-DynamoDBChangeDataCapturePipeline --query Blueprint.PipelineConfigurationBody --output text > blueprint.yml
The code looks like this.
Compared with the CLI version, the way the index-related placeholders are escaped has been changed. Writing YAML this way is error-prone and fiddly, so building it as an object and converting it afterwards would probably be better (not tried here; a sketch follows the code below).
const pipelineConfiguration = ` version: "2" dynamodb-pipeline: source: dynamodb: acknowledgments: true tables: - table_arn: "${table.tableArn}" stream: start_position: "LATEST" export: s3_bucket: "${s3bucket.bucketName}" s3_region: "${this.region}" s3_prefix: "opensearch-export/" aws: sts_role_arn: "${pipelineRole.roleArn}" region: "${this.region}" sink: - opensearch: hosts: - "${collection.attrCollectionEndpoint}" index: '\${getMetadata("table_name")}' index_type: "custom" normalize_index: true document_id: '\${getMetadata("primary_key")}' action: '\${getMetadata("opensearch_action")}' document_version: '\${getMetadata("document_version")}' document_version_type: "external" aws: sts_role_arn: "${pipelineRole.roleArn}" region: "${this.region}" serverless: true dlq: s3: bucket: "${s3bucket.bucketName}" key_path_prefix: "dynamodb-pipeline/dlq" region: "${this.region}" sts_role_arn: "${pipelineRole.roleArn}" `; const pipeline = new CfnPipeline(this, "pipeline", { pipelineConfigurationBody: pipelineConfiguration, pipelineName: 'serverless-ingestion', minUnits: 1, maxUnits: 2, });
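As a sketch of the object-based idea mentioned above (untested here, and it assumes adding the js-yaml package to the project as an extra dependency), the same configuration could be built as a plain object inside the stack constructor and then serialized, so that quoting of the getMetadata placeholders is handled for you:

import { dump } from 'js-yaml'; // assumption: js-yaml is added to package.json

// Same pipeline definition as the template string above, expressed as an object.
// Single-quoted strings keep the ${getMetadata(...)} placeholders literal.
const pipelineConfigObject = {
  version: '2',
  'dynamodb-pipeline': {
    source: {
      dynamodb: {
        acknowledgments: true,
        tables: [{
          table_arn: table.tableArn,
          stream: { start_position: 'LATEST' },
          export: {
            s3_bucket: s3bucket.bucketName,
            s3_region: this.region,
            s3_prefix: 'opensearch-export/',
          },
        }],
        aws: { sts_role_arn: pipelineRole.roleArn, region: this.region },
      },
    },
    sink: [{
      opensearch: {
        hosts: [collection.attrCollectionEndpoint],
        index: '${getMetadata("table_name")}',
        index_type: 'custom',
        normalize_index: true,
        document_id: '${getMetadata("primary_key")}',
        action: '${getMetadata("opensearch_action")}',
        document_version: '${getMetadata("document_version")}',
        document_version_type: 'external',
        aws: {
          sts_role_arn: pipelineRole.roleArn,
          region: this.region,
          serverless: true,
        },
        dlq: {
          s3: {
            bucket: s3bucket.bucketName,
            key_path_prefix: 'dynamodb-pipeline/dlq',
            region: this.region,
            sts_role_arn: pipelineRole.roleArn,
          },
        },
      },
    }],
  },
};

// Pass this string to CfnPipeline instead of the hand-written template literal.
const pipelineConfigurationBody = dump(pipelineConfigObject);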
Deploy and test
Deploy. When the confirmation prompt appears, press y to continue.
cdk deploy
Once it completes, try inserting data into DynamoDB.
TABLE_NAME=ingestion-table

aws dynamodb put-item \
    --table-name $TABLE_NAME \
    --item '{"name": {"S": "saki"}, "age": {"N": "16"}, "height": {"N": "152"}}'

aws dynamodb put-item \
    --table-name $TABLE_NAME \
    --item '{"name": {"S": "temari"}, "age": {"N": "15"}, "height": {"N": "162"}}'

aws dynamodb put-item \
    --table-name $TABLE_NAME \
    --item '{"name": {"S": "kotone"}, "age": {"N": "15"}, "height": {"N": "156"}}'
Check OpenSearch. It may take a little while for the data to show up.
export AWS_DEFAULT_REGION='ap-northeast-1'
COLLECTION_NAME=ingestion-collection
HOST=$(aws opensearchserverless batch-get-collection --names $COLLECTION_NAME --query 'collectionDetails[].collectionEndpoint' --output text) && echo $HOST

awscurl --service aoss --region $AWS_DEFAULT_REGION -X GET ${HOST}/_cat/indices
awscurl --service aoss --region $AWS_DEFAULT_REGION -X GET ${HOST}/${TABLE_NAME}/_search | jq .
If the data you put into DynamoDB shows up in OpenSearch, it worked!
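As an extra check that is not part of the original steps: the pipeline also propagates deletes, so removing one of the items and searching again should, after a short delay, drop the corresponding document from the index.

aws dynamodb delete-item \
    --table-name $TABLE_NAME \
    --key '{"name": {"S": "saki"}, "age": {"N": "16"}}'

awscurl --service aoss --region $AWS_DEFAULT_REGION -X GET ${HOST}/${TABLE_NAME}/_search | jq .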
Cleanup
Delete everything with the command below. Both the S3 bucket and the DynamoDB table are configured to be removed together with the CDK stack, so they should be deleted. Since no logging was set up this time, nothing should be left behind. (Let me know if any resources do remain.)
cdk destroy
Summary
This article summarized how to build the zero-ETL integration between Amazon DynamoDB and Amazon OpenSearch Service with AWS CDK.
It is a little inconvenient that the pipeline YAML can still only be passed as a string. Once an L2 construct appears, I expect it will be possible to pass an object and get help from TypeScript's types.
Full CDK definition
The full code is below.
import * as cdk from 'aws-cdk-lib'; import { AttributeType, StreamViewType, Table } from 'aws-cdk-lib/aws-dynamodb'; import { Effect, Policy, PolicyStatement, Role, ServicePrincipal, User } from 'aws-cdk-lib/aws-iam'; import { CfnAccessPolicy, CfnCollection, CfnSecurityPolicy } from 'aws-cdk-lib/aws-opensearchserverless'; import { CfnPipeline } from 'aws-cdk-lib/aws-osis'; import { Bucket } from 'aws-cdk-lib/aws-s3'; import { Construct } from 'constructs'; export class ZeroEtlDynamodbAossStack extends cdk.Stack { constructor(scope: Construct, id: string, props?: cdk.StackProps) { super(scope, id, props); const s3bucket = new Bucket(this, 'S3Bucket', { bucketName: 'ingestion-dynamodb', removalPolicy: cdk.RemovalPolicy.DESTROY, autoDeleteObjects: true, }); const table = new Table(this, 'DynamoDBTable', { tableName: 'ingestion-table', partitionKey: {name: 'name', type: AttributeType.STRING}, sortKey: {name: 'age', type: AttributeType.NUMBER}, readCapacity: 1, writeCapacity: 1, pointInTimeRecovery: true, stream: StreamViewType.NEW_IMAGE, removalPolicy: cdk.RemovalPolicy.DESTROY }); // Pieplineç¨ã®Role const pipelineRole = new Role(this, 'pipelineRole', { roleName: 'PipelineRole', assumedBy: new ServicePrincipal('osis-pipelines.amazonaws.com'), }); // AOSS Collection const collection = new CfnCollection(this, "OpenSearchCollection", { name: "ingestion-collection", type: "SEARCH", standbyReplicas: "DISABLED", }); // AOSS Data Access Policy const cliUser = User.fromUserName(this, 'existingUser', 'local-cli-user'); const dataAccessPolicy = new CfnAccessPolicy(this, "OpenSearchAccessPolicy", { name: "access-policy", type: "data", policy: JSON.stringify([ { "Rules": [ { "Resource": [`index/${collection.name}/*`], "Permission": [ "aoss:CreateIndex", "aoss:UpdateIndex", "aoss:DescribeIndex", "aoss:ReadDocument", "aoss:WriteDocument" ], "ResourceType": "index" } ], "Principal": [ cliUser.userArn, pipelineRole.roleArn] } ]), }); // AOSS Encryption Policy const encryptionPolicy = new CfnSecurityPolicy(this, 'OpenSearchEncryptionPolicy', { name: 'encryption-policy', type: 'encryption', policy: JSON.stringify({ "Rules": [ { "ResourceType": "collection", "Resource": [`collection/${collection.name}`] } ], "AWSOwnedKey": true }), }); // NOTE Collectionã¯encryptionPolicyã«ä¾åãã¦ããã collection.addDependency(encryptionPolicy) // AOSS Network Policy const networkPolicy = new CfnSecurityPolicy(this, 'OpenSearchNetworkPolicy', { name: "network-policy", type: "network", policy: JSON.stringify([ { "Rules": [ { "ResourceType": "dashboard", "Resource": [ `collection/${collection.name}`] }, { "ResourceType": "collection", "Resource": [ `collection/${collection.name}`] }, ], "AllowFromPublic": true, } ]) }); const pipelinePolicy = new Policy(this, 'pipelinePolicy', { policyName: 'pipelinePolicy', statements: [ new PolicyStatement({ effect: Effect.ALLOW, actions: [ "dynamodb:DescribeTable", "dynamodb:DescribeContinuousBackups", "dynamodb:ExportTableToPointInTime" ], resources: [`${table.tableArn}`] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "dynamodb:DescribeExport" ], resources: [`${table.tableArn}/export/*`] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "dynamodb:DescribeStream", "dynamodb:GetRecords", "dynamodb:GetShardIterator" ], resources: [`${table.tableArn}/stream/*`] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "s3:GetObject", "s3:AbortMultipartUpload", "s3:PutObject", "s3:PutObjectAcl" ], resources: [ `${s3bucket.bucketArn}/*` ] }), new PolicyStatement({ effect: 
Effect.ALLOW, actions: [ "aoss:BatchGetCollection", "aoss:APIAccessAll" ], resources: [ `${collection.attrArn}` ] }), new PolicyStatement({ effect: Effect.ALLOW, actions: [ "aoss:CreateSecurityPolicy", "aoss:GetSecurityPolicy", "aoss:UpdateSecurityPolicy" ], resources: ['*'], conditions: { StringEquals: { "aoss:collection": collection.name } } }), ] }); pipelinePolicy.attachToRole(pipelineRole) const pipelineConfiguration = ` version: "2" dynamodb-pipeline: source: dynamodb: acknowledgments: true tables: - table_arn: "${table.tableArn}" stream: start_position: "LATEST" export: s3_bucket: "${s3bucket.bucketName}" s3_region: "${this.region}" s3_prefix: "opensearch-export/" aws: sts_role_arn: "${pipelineRole.roleArn}" region: "${this.region}" sink: - opensearch: hosts: - "${collection.attrCollectionEndpoint}" index: '\${getMetadata("table_name")}' index_type: "custom" normalize_index: true document_id: '\${getMetadata("primary_key")}' action: '\${getMetadata("opensearch_action")}' document_version: '\${getMetadata("document_version")}' document_version_type: "external" aws: sts_role_arn: "${pipelineRole.roleArn}" region: "${this.region}" serverless: true dlq: s3: bucket: "${s3bucket.bucketName}" key_path_prefix: "dynamodb-pipeline/dlq" region: "${this.region}" sts_role_arn: "${pipelineRole.roleArn}" `; const pipeline = new CfnPipeline(this, "pipeline", { pipelineConfigurationBody: pipelineConfiguration, pipelineName: 'serverless-ingestion', minUnits: 1, maxUnits: 2, }); pipeline.node.addDependency(pipelinePolicy); pipeline.node.addDependency(collection); } }
[AWS CLI] Building the zero-ETL integration between Amazon DynamoDB and Amazon OpenSearch Serverless with AWS CLI
- Introduction
- Notes
- Reference links
- Environment
- Overview
- 0. Preparation
- 1. Create S3 and DynamoDB
- 2. Create the pipeline's Role
- 3. Create the Amazon OpenSearch collection
- 4. Create the policy
- 5. Create the pipeline
- 6. Run the test
- Deleting the resources
- Aside: ChatGPT and the AWS CLI
- Summary
Introduction
This is a record of how to build the zero-ETL integration between DynamoDB and OpenSearch Service, released at the end of 2023, with the AWS CLI. The zero-ETL integration is a mechanism that syncs data inserted into DynamoDB to OpenSearch Service. Since you no longer need to write your own Lambda that picks up events and inserts them into OpenSearch, it should make things easier both for coding and for resource management.
All of the steps below can be run by copy-pasting. (* Deletion steps are also included.)
Notes
These resources incur charges. Delete them once you are done testing. The author cannot take any responsibility for problems caused by following the steps below; proceed at your own risk.
Reference links
The official AWS announcement:
Amazon DynamoDB zero-ETL integration with Amazon OpenSearch Service is now available | Amazon Web Services Blog
The official tutorial (GUI-based). In this article, we build the collection (serverless) version of the resources below using the AWS CLI.
Tutorial: Ingesting data into a domain using Amazon OpenSearch Ingestion - Amazon OpenSearch Service
DynamoDB zero-ETL integration with Amazon OpenSearch Service - Amazon DynamoDB
For the basics of running the AWS CLI, I referred to the following series.
Environment
| | Version |
|---|---|
| MacOS Sonoma | 14.4.1 |
| AWS CLI | 2.15.34 |
| awscurl | 0.33 |
Overview
We will create the following resources:
- Amazon S3
- An Amazon DynamoDB table
- An Amazon OpenSearch Service collection
  - Three kinds of policies
    - Data access policies
    - Encryption policies
    - Network policies
- Pipeline
- IAM Role
- IAM Policy: access permissions to OpenSearch and DynamoDB
The pipeline does the following:
- Watches DynamoDB (detects that data has been inserted)
- Inserts/deletes/updates the data in OpenSearch
- Uploads backups and the like to S3
Also, Amazon OpenSearch Service has resource-based policies. Allowing access to Amazon OpenSearch Service on the IAM role alone is not sufficient; the IAM role must also be granted access in the Amazon OpenSearch Service Data access policies (described later).
0. Preparation
Make sure the AWS CLI is ready to use, and give the user who runs the CLI the permissions they need.
Define the names of the resources to create, and so on, as shell variables. Run the following in your shell, then proceed with the steps that follow.
export AWS_DEFAULT_REGION='ap-northeast-1'
COLLECTION_NAME=ingestion-collection
PIPELINE_NAME=serverless-ingestion
TABLE_NAME=ingestion-table
BUCKET_NAME="ingestion-dynamodb"
BUCKET_ARN=arn:aws:s3:::${BUCKET_NAME} && echo $BUCKET_ARN
PATH_PREFIX1="opensearch-export"
PATH_PREFIX2="dynamodb-pipeline"
IAM_POLICY_NAME=pipeline-policy
IAM_ROLE_NAME=PipelineRole
ACCOUNT_ID=$(aws sts get-caller-identity --query 'Account' --output text) && echo $ACCOUNT_ID
COLLECTION_ARN="arn:aws:es:${AWS_DEFAULT_REGION}:${ACCOUNT_ID}:domain/${COLLECTION_NAME}" && echo $COLLECTION_ARN
1. Create S3 and DynamoDB
Create the S3 bucket.
aws s3api create-bucket --bucket $BUCKET_NAME --create-bucket-configuration LocationConstraint=$AWS_DEFAULT_REGION
Check that it completed: if a bucket with the specified name exists, it worked.
aws s3 ls | grep $BUCKET_NAME
Create the DynamoDB table. This time we create a table that has name and age attributes, with the throughput fixed at the minimum (feel free to change this).
aws dynamodb create-table \
    --table-name $TABLE_NAME \
    --attribute-definitions \
        AttributeName=name,AttributeType=S \
        AttributeName=age,AttributeType=N \
    --key-schema \
        AttributeName=name,KeyType=HASH \
        AttributeName=age,KeyType=RANGE \
    --provisioned-throughput \
        ReadCapacityUnits=1,WriteCapacityUnits=1
As a completion check, also grab the table's ARN.
TABLE_ARN=$(aws dynamodb describe-table --table-name $TABLE_NAME --query Table.TableArn --output text) && echo $TABLE_ARN
Enable PITR (Point-in-Time Recovery). OpenSearch Ingestion needs it when using the initial export data.
aws dynamodb update-continuous-backups --table-name $TABLE_NAME --point-in-time-recovery-specification PointInTimeRecoveryEnabled=true

# Check
aws dynamodb describe-continuous-backups --table-name $TABLE_NAME
Change the DynamoDB stream to NEW_IMAGE. (* This captures the new image of an item whenever a change occurs.)
aws dynamodb update-table --table-name $TABLE_NAME --stream-specification StreamEnabled=true,StreamViewType=NEW_IMAGE

# Check
aws dynamodb describe-table --table-name $TABLE_NAME --query Table.StreamSpecification
2. Create the pipeline's Role
First, create only the role. The policy is created in step 4 (we want to create the collection and get its ID before creating the policy).
To begin with, save the trust policy to a file. It states that osis-pipelines.amazonaws.com is the one allowed to use this role.
IAM_ROLE_PRINCIPAL='osis-pipelines.amazonaws.com' FILE_IAM_ROLE_DOC="role-document.json" cat << EOF > ${FILE_IAM_ROLE_DOC} { "Version": "2012-10-17", "Statement": [ { "Action": "sts:AssumeRole", "Principal": { "Service": "${IAM_ROLE_PRINCIPAL}" }, "Effect": "Allow", "Sid": "" } ] } EOF cat ${FILE_IAM_ROLE_DOC}
Create the role, specifying the trust policy above.
aws iam create-role --role-name $IAM_ROLE_NAME --assume-role-policy-document file://$FILE_IAM_ROLE_DOC
Check the created role.
aws iam list-roles --query "Roles[?RoleName == '${IAM_ROLE_NAME}'].RoleName"
3. Create the Amazon OpenSearch collection
This part is a little involved; we create the following resources:
- Data access policies
  - A resource policy. It grants access to the role created earlier.
- Encryption policies
  - Required. Specifies how data stored in the OpenSearch collection is encrypted.
- Network policies
  - This time we allow public access. It also lets you open the dashboard when logged in to AWS.
- collection
  - This time we use the serverless flavor of OpenSearch, not a Domain.
Key point: there is no join-table-like object tying these together. You simply create the various policies ahead of time, specifying the name of the collection you are about to create, and the policies take effect. In particular, the Encryption policy must be created first, otherwise the collection cannot be created.
Creating the Data access policy
Save the policy document to a file. This time we allow the CLI user and the role created earlier. If you also want to check things in the GUI, additionally specify the user you log in to the AWS management console with (this can also be added later).
FILE_ACCESS_POLICY_DOC="access-policy-document.json" CALLER_ARN=$(aws sts get-caller-identity --query Arn --output text) && echo $CALLER_ARN IAM_ROLE_ARN=$(aws iam get-role --role-name $IAM_ROLE_NAME --query Role.Arn --output text) && echo $IAM_ROLE_ARN cat << EOS > $FILE_ACCESS_POLICY_DOC [ { "Rules": [ { "Resource": [ "index/${COLLECTION_NAME}/*" ], "Permission": [ "aoss:CreateIndex", "aoss:UpdateIndex", "aoss:DescribeIndex", "aoss:ReadDocument", "aoss:WriteDocument" ], "ResourceType": "index" } ], "Principal": [ "${IAM_ROLE_ARN}", "${CALLER_ARN}" ], "Description": "Rule 1" } ] EOS cat $FILE_ACCESS_POLICY_DOC
Create the Data access policy.
For details, see aws opensearchserverless create-access-policy help.
Note that as of 2024/06, data is the only selectable type.
ACCESS_POLICY_NAME=access-policy
aws opensearchserverless create-access-policy --name $ACCESS_POLICY_NAME --policy file://$FILE_ACCESS_POLICY_DOC --type data
Creating the Encryption policy
Save the policy document to a file. This time we use an AWS-owned key, so AWSOwnedKey is set to true.
FILE_ENC_POLICY_DOC="access-policy-document.json" cat << EOS > $FILE_ENC_POLICY_DOC { "Rules": [ { "ResourceType": "collection", "Resource": [ "collection/${COLLECTION_NAME}" ] } ], "AWSOwnedKey": true } EOS cat $FILE_ENC_POLICY_DOC
Create the Encryption policy. Confusingly, this one uses the create-security-policy API, with encryption specified as the type.
ENC_POLICY_NAME=encryption-policy
aws opensearchserverless create-security-policy --name $ENC_POLICY_NAME --policy file://$FILE_ENC_POLICY_DOC --type encryption
Creating the Network policy
Create the Network policy. First, write the policy document to a file.
FILE_NET_POLICY_DOC="access-policy-document.json" cat << EOS > $FILE_NET_POLICY_DOC [ { "Rules": [ { "ResourceType": "dashboard", "Resource": [ "collection/${COLLECTION_NAME}"] }, { "ResourceType": "collection", "Resource": [ "collection/${COLLECTION_NAME}"] } ], "AllowFromPublic": true } ] EOS cat $FILE_NET_POLICY_DOC
Create the Network policy. Again, this uses the create-security-policy API, with network specified as the type.
NET_POLICY_NAME=network-policy
aws opensearchserverless create-security-policy --name $NET_POLICY_NAME --policy file://$FILE_NET_POLICY_DOC --type network
Create the collection. Since this is for testing, standby replicas are disabled and the type is set to SEARCH.
aws opensearchserverless create-collection --name $COLLECTION_NAME --standby-replicas DISABLED --type SEARCH
4. Create the policy
Create the policy that will be attached to the pipeline.
First, write the required permissions as JSON and save them to a file.
COLLECTION_ID=$(aws opensearchserverless batch-get-collection --names $COLLECTION_NAME --query 'collectionDetails[].id' --output text) && echo $COLLECTION_ID FILE_IAM_POLICY_DOC="policy-document.json" cat << EOS > $FILE_IAM_POLICY_DOC { "Version": "2012-10-17", "Statement": [ { "Sid": "allowRunExportJob", "Effect": "Allow", "Action": [ "dynamodb:DescribeTable", "dynamodb:DescribeContinuousBackups", "dynamodb:ExportTableToPointInTime" ], "Resource": [ "${TABLE_ARN}" ] }, { "Sid": "allowCheckExportjob", "Effect": "Allow", "Action": [ "dynamodb:DescribeExport" ], "Resource": [ "${TABLE_ARN}/export/*" ] }, { "Sid": "allowReadFromStream", "Effect": "Allow", "Action": [ "dynamodb:DescribeStream", "dynamodb:GetRecords", "dynamodb:GetShardIterator" ], "Resource": [ "${TABLE_ARN}/stream/*" ] }, { "Sid": "allowReadAndWriteToS3ForExport", "Effect": "Allow", "Action": [ "s3:GetObject", "s3:AbortMultipartUpload", "s3:PutObject", "s3:PutObjectAcl" ], "Resource": [ "${BUCKET_ARN}/${PATH_PREFIX1}/*", "${BUCKET_ARN}/${PATH_PREFIX2}/*" ] }, { "Action": [ "aoss:BatchGetCollection", "aoss:APIAccessAll" ], "Effect": "Allow", "Resource": "arn:aws:aoss:${AWS_DEFAULT_REGION}:${ACCOUNT_ID}:collection/${COLLECTION_ID}" }, { "Action": [ "aoss:CreateSecurityPolicy", "aoss:GetSecurityPolicy", "aoss:UpdateSecurityPolicy" ], "Effect": "Allow", "Resource": "*", "Condition": { "StringEquals": { "aoss:collection": "${COLLECTION_NAME}" } } } ] } EOS cat $FILE_IAM_POLICY_DOC
Create the policy.
aws iam create-policy --policy-name ${IAM_POLICY_NAME} --policy-document file://$FILE_IAM_POLICY_DOC
Check the created policy.
IAM_POLICY_ARN="arn:aws:iam::${ACCOUNT_ID}:policy/${IAM_POLICY_NAME}" && echo $IAM_POLICY_ARN

aws iam get-policy --policy-arn $IAM_POLICY_ARN
Attach the policy to the pipeline role created in step 2.
aws iam attach-role-policy --role-name $IAM_ROLE_NAME --policy-arn $IAM_POLICY_ARN
Confirm that the policy is attached.
aws iam list-attached-role-policies --role-name $IAM_ROLE_NAME
5. Create the pipeline
Refer to the official documentation here:
Creating Amazon OpenSearch Ingestion pipelines - Amazon OpenSearch Service
The part of the pipeline that does the ETL processing is defined in YAML. Writing it from scratch is painful, so we use a blueprint that AWS provides. List the blueprints and look for the one with DynamoDB in its name.
aws osis list-pipeline-blueprints
Let's use AWS-DynamoDBChangeDataCapturePipeline.
Fetch the blueprint.
aws osis get-pipeline-blueprint --blueprint-name AWS-DynamoDBChangeDataCapturePipeline --query Blueprint.PipelineConfigurationBody --output text > blueprint.yml
Based on the blueprint, we fill in the information needed for the ETL (copying the contents of the DynamoDB table into OpenSearch).
For the hosts setting, wait a while until the collection has finished creating and its endpoint is available. While the command below returns CREATING, you cannot get the endpoint; wait until it becomes ACTIVE.
aws opensearchserverless batch-get-collection --names $COLLECTION_NAME --query 'collectionDetails[].status'
Once the collection is ready, you can write the YAML using the following (this is an example of the contents after editing the blueprint).
TABLE_ARN=$(aws dynamodb describe-table --table-name $TABLE_NAME --query Table.TableArn --output text) && echo $TABLE_ARN IAM_ROLE_ARN=$(aws iam get-role --role-name $IAM_ROLE_NAME --query Role.Arn --output text) && echo $IAM_ROLE_ARN HOST=$(aws opensearchserverless batch-get-collection --names $COLLECTION_NAME --query 'collectionDetails[].collectionEndpoint' --output text) && echo $HOST FILE_INGESTION_DOCUMENT=injestion.yml cat << EOS > $FILE_INGESTION_DOCUMENT version: "2" dynamodb-pipeline: source: dynamodb: acknowledgments: true tables: - table_arn: "${TABLE_ARN}" stream: start_position: "LATEST" export: s3_bucket: "${BUCKET_NAME}" s3_region: "${AWS_DEFAULT_REGION}" s3_prefix: "${PATH_PREFIX1}/" aws: sts_role_arn: "${IAM_ROLE_ARN}" region: "${AWS_DEFAULT_REGION}" sink: - opensearch: hosts: [ "${HOST}" ] index: "\${getMetadata(\"table_name\")}" index_type: custom normalize_index: true document_id: "\${getMetadata(\"primary_key\")}" action: "\${getMetadata(\"opensearch_action\")}" document_version: "\${getMetadata(\"document_version\")}" document_version_type: "external" aws: sts_role_arn: "${IAM_ROLE_ARN}" region: "${AWS_DEFAULT_REGION}" serverless: true dlq: s3: bucket: "${BUCKET_NAME}" key_path_prefix: "${PATH_PREFIX2}/dlq" region: "${AWS_DEFAULT_REGION}" sts_role_arn: "${IAM_ROLE_ARN}" EOS cat $FILE_INGESTION_DOCUMENT
Create the pipeline. min/max-units is the read/write capacity and affects the hourly cost; the unit is called an OpenSearch Compute Unit (OCU).
aws osis create-pipeline \
    --pipeline-name $PIPELINE_NAME \
    --min-units 1 \
    --max-units 2 \
    --pipeline-configuration-body file://${FILE_INGESTION_DOCUMENT}
This, too, takes several minutes to create.
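If you want to watch the progress (an extra step, not part of the original procedure), you can poll the pipeline status; it should go from CREATING to ACTIVE:

aws osis get-pipeline --pipeline-name $PIPELINE_NAME --query 'Pipeline.Status' --output text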
6. Run the test
Try inserting data into DynamoDB.
aws dynamodb put-item \
    --table-name $TABLE_NAME \
    --item '{"name": {"S": "saki"}, "age": {"N": "16"}, "height": {"N": "152"}}'

aws dynamodb put-item \
    --table-name $TABLE_NAME \
    --item '{"name": {"S": "temari"}, "age": {"N": "15"}, "height": {"N": "162"}}'

aws dynamodb put-item \
    --table-name $TABLE_NAME \
    --item '{"name": {"S": "kotone"}, "age": {"N": "15"}, "height": {"N": "156"}}'
Check OpenSearch. It may take a little while for the data to show up.
awscurl --service aoss --region $AWS_DEFAULT_REGION -X GET ${HOST}/_cat/indices
awscurl --service aoss --region $AWS_DEFAULT_REGION -X GET ${HOST}/${TABLE_NAME}/_search | jq .
If the data inserted into DynamoDB is reflected in OpenSearch as shown below, it worked!
{ "took": 1494, "timed_out": false, "_shards": { "total": 0, "successful": 0, "skipped": 0, "failed": 0 }, "hits": { "total": { "value": 3, "relation": "eq" }, "max_score": 1, "hits": [ { "_index": "ingestion-table", "_id": "temari|15", "_score": 1, "_source": { "name": "temari", "age": 15, "height": 162 } }, { "_index": "ingestion-table", "_id": "kotone|15", "_score": 1, "_source": { "name": "kotone", "age": 15, "height": 156 } }, { "_index": "ingestion-table", "_id": "saki|16", "_score": 1, "_source": { "name": "saki", "age": 16, "height": 152 } } ] } }
Deleting the resources
Delete the resources in the following order.
Pipeline
aws osis delete-pipeline --pipeline-name $PIPELINE_NAME
Amazon OpenSearch Service Collection
COLLECTION_ID=$(aws opensearchserverless list-collections --query "collectionSummaries[?name=='${COLLECTION_NAME}'].id" --output text) && echo $COLLECTION_ID

aws opensearchserverless delete-collection --id $COLLECTION_ID
The three Amazon OpenSearch Service policies
aws opensearchserverless delete-access-policy --name $ACCESS_POLICY_NAME --type data
aws opensearchserverless delete-security-policy --name $ENC_POLICY_NAME --type encryption
aws opensearchserverless delete-security-policy --name $NET_POLICY_NAME --type network
Amazon DynamoDB
aws dynamodb delete-table --table-name $TABLE_NAME
Amazon S3
aws s3 rm s3://$BUCKET_NAME --recursive
aws s3api delete-bucket --bucket $BUCKET_NAME
The Policy and Role that were attached to the pipeline
aws iam detach-role-policy --role-name $IAM_ROLE_NAME --policy-arn $IAM_POLICY_ARN
aws iam delete-policy --policy-arn $IAM_POLICY_ARN
aws iam delete-role --role-name $IAM_ROLE_NAME
Aside: ChatGPT and the AWS CLI
The AWS CLI and CDK pair well with ChatGPT; given appropriate prompts it is very handy when creating resources. However, producing an accurate build procedure that combines multiple resources in one go still seems to be difficult for it.
Personally, I find ChatGPT useful in cases like these:
- Questions scoped to a single request (e.g. "tell me how to create an S3 bucket named xx")
- Asking how to write the --query parameter
- Giving it the log of resources I created and asking how to delete them
I value the AWS CLI and CDK because they leave a record of how resources were built, but looking up parameters and the like is a bit tedious. Personally, I find it easiest to combine the aws cli help, CDK autocompletion and the (TypeScript) documentation, ask ChatGPT questions along the way, and finally land everything in CDK code.
Summary
This article summarized how to build the zero-ETL integration between Amazon DynamoDB and Amazon OpenSearch Service with the AWS CLI. AWS has a lot of different resources and can be hard to keep track of; just tracing a GUI tutorial leaves nothing behind afterwards, so building things with the CLI and writing them up as notes deepens understanding. CDK is built on the same APIs as the CLI, so this also makes writing CDK later easier.
[Django] A pattern collection for understanding select_related and prefetch_related through examples
- Introduction
- Reference links
- Environment
- Preparation: creating the models and inserting data
- No.0 Checking the SQL that gets issued
- No.1 Fetching the parent with select_related
- No.2 Fetching the parent's parent with select_related
- No.3 Fetching multiple children with prefetch_related
- No.4 Filtering children with a Prefetch object
- No.5 Two-level relations with prefetch_related: ManyToMany-ManyToMany
- No.6 Two-level relations with prefetch_related: ForeignKey-ManyToMany
- No.7 Two-level relations with prefetch_related: ForeignKey-ManyToMany + select_related
- No.8 Using order_by on a two-level relation with Prefetch
- No.9 Two-level relations with prefetch_related: ManyToMany-ForeignKey
- No.10 Multiple filtered prefetches with to_attr
- No.11 Prefetching one- and two-level relations, each with conditions (ManyToMany-ManyToMany)
- No.12 Fetching a single child starting from the parent
Introduction
At the end of last year I was writing Django.
Django's ORM is convenient: with concise code it issues SQL and bridges the results into Python objects.
However, if you use it without thinking, the number of SQL queries keeps growing and performance keeps dropping. To have efficient SQL issued, you use select_related and prefetch_related.
I keep forgetting "how did this pattern go again...", so I put together this pattern collection.
It is aimed at readers who:
- get confused once the models become complex, and stall when trying to reduce queries;
- understand the basic usage and purpose of select_related/prefetch_related;
- understand SQL at a minimum level (the basic syntax of INNER JOIN, OUTER JOIN, WHERE, IN).
All the samples are run in the django shell, and the data-creation code is included, so you can try them right away.
Reference links
This article is based on the examples in the official documentation, with additional notes.
Environment
| | Version |
|---|---|
| MacOS Ventura | 13.5.2 |
| Python3 | 3.11.4 |
| Django | 5.0.1 |
See the following for how to set up the Django environment.
Preparation: creating the models and inserting data
The example given in the following documentation is slightly modified here.
QuerySet API reference | Django documentation | Django
The models are Pizza, Topping, and Restaurant.
- Pizza and Topping have a many-to-many relationship.
- Restaurant has a many-to-many relationship with Pizza through the pizzas field (all the pizzas it serves).
- Restaurant has a many-to-one relationship with Pizza through the best_pizza field (its most popular pizza).
  - In that relationship, Pizza is the parent.
from django.db import models class Country(models.Model): name = models.CharField(max_length=256) def __str__(self): return self.name class Topping(models.Model): name = models.CharField(max_length=256) def __str__(self): return self.name class Pizza(models.Model): name = models.CharField(max_length=256) country = models.ForeignKey( Country, related_name='pizza', null=True, on_delete=models.CASCADE) toppings = models.ManyToManyField(Topping) def __str__(self): return self.name class Restaurant(models.Model): name = models.CharField(max_length=256) # 追å pizzas = models.ManyToManyField(Pizza, related_name='restaurants') best_pizza = models.ForeignKey( Pizza, related_name='championed_by', on_delete=models.CASCADE) def __str__(self): return self.name
Run the migrations.
(env) $ python manage.py makemigrations
(env) $ python manage.py migrate
We want to insert data, so enter the django shell.
(env) $ python manage.py shell
Run the following code inside the shell.
from app.models import Topping, Pizza, Restaurant, Country i = Country.objects.create(name='ã¤ã¿ãªã¢') Topping.objects.create(name='ããã') Topping.objects.create(name='ãã¯ã«ã¹') Topping.objects.create(name='ãã¼ã³ã³') Topping.objects.create(name='ãã¤ãããã«') Topping.objects.create(name='ãã¼ãº') Topping.objects.create(name='ç¼ãé') pizza_A = Pizza.objects.create(name='ãã¶A') pizza_A.toppings.set(Topping.objects.filter(name__in=['ããã', 'ãã¯ã«ã¹', 'ãã¼ã³ã³'])) pizza_A.country = i pizza_A.save() pizza_B = Pizza.objects.create(name='ãã¶B') pizza_B.toppings.set(Topping.objects.filter(name__in=['ããã', 'ãã¯ã«ã¹', 'ãã¤ãããã«', 'ãã¼ãº'])) pizza_C = Pizza.objects.create(name='ãã¶C') pizza_C.toppings.set(Topping.objects.filter(name__in=['ããã', 'ç¼ãé'])) restaurant_1 = Restaurant.objects.create(name='ã¬ã¹ãã©ã³1', best_pizza=pizza_A) restaurant_1.pizzas.set(Pizza.objects.filter(name__in=['ãã¶A', 'ãã¶B'])) restaurant_2 = Restaurant.objects.create(name='ã¬ã¹ãã©ã³2', best_pizza=pizza_C) restaurant_2.pizzas.set(Pizza.objects.filter(name__in=['ãã¶A', 'ãã¶C']))
No.0 Checking the SQL that gets issued
Normally you would check the issued SQL by installing django-debug-toolbar or by looking at the logs. This time I want everything to stay inside the django shell, so we look at django.db.connection.queries, which holds the history of issued SQL.
Here is how to enter the django shell again.
(env) $ python manage.py shell
Also, since we only want to see the SQL itself, define the following helper function (just paste it into the django shell).
from django.db import reset_queries, connection

def f(q):
    for qt in q:
        print(qt['sql'])

# Check the queries
f(connection.queries)

# Reset
reset_queries()
Everything from here on is run inside the django shell.
No.1 Fetching the parent with select_related
Restaurant : best pizza is a many-to-one relationship. Searching from the restaurant (child) side, the most popular pizza (parent) is uniquely determined.
In the plain form, in addition to the query that fetches the restaurants, one query per fetched restaurant is issued to get its most popular pizza.
>>> for restaurant in Restaurant.objects.all(): print(f'{restaurant.name}åºã®ä¸çªäººæ°ã®ãã¶ã¯{restaurant.best_pizza.name}') ã¬ã¹ãã©ã³1åºã®ä¸çªäººæ°ã®ãã¶ã¯ãã¶A ã¬ã¹ãã©ã³2åºã®ä¸çªäººæ°ã®ãã¶ã¯ãã¶C
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id" FROM "app_restaurant" SELECT "app_pizza"."id", "app_pizza"."name" FROM "app_pizza" WHERE "app_pizza"."id" = 1 LIMIT 21 SELECT "app_pizza"."id", "app_pizza"."name" FROM "app_pizza" WHERE "app_pizza"."id" = 3 LIMIT 21
In a case like this, SQL would join and fetch the parent. In Django's ORM, select_related makes that join possible.
Checking the issued SQL below, we can see that it is indeed an INNER JOIN and that only one query was issued.
>>> reset_queries() # åæåãã¦ããã¾ã. >>> for restaurant in Restaurant.objects.select_related('best_pizza').all(): print(f'{restaurant.name}åºã®ä¸çªäººæ°ã®ãã¶ã¯{restaurant.best_pizza.name}') ã¬ã¹ãã©ã³1åºã®ä¸çªäººæ°ã®ãã¶ã¯ãã¶A ã¬ã¹ãã©ã³2åºã®ä¸çªäººæ°ã®ãã¶ã¯ãã¶C
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_restaurant" INNER JOIN "app_pizza" ON ("app_restaurant"."best_pizza_id" = "app_pizza"."id")
No.2 Fetching the parent's parent with select_related
With another double underscore you can keep following the parent's parent, and so on. Note that the second join is a LEFT OUTER JOIN (Country is set only on Pizza A).
for restaurant in Restaurant.objects.select_related('best_pizza__country').all(): print(f'{restaurant.name}åºã®ä¸çªäººæ°ã®ãã¶ã¯{restaurant.best_pizza.name}({restaurant.best_pizza.country})') ã¬ã¹ãã©ã³1åºã®ä¸çªäººæ°ã®ãã¶ã¯ãã¶A(ã¤ã¿ãªã¢) ã¬ã¹ãã©ã³2åºã®ä¸çªäººæ°ã®ãã¶ã¯ãã¶C(None)
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id", "app_country"."id", "app_country"."name" FROM "app_restaurant" INNER JOIN "app_pizza" ON ("app_restaurant"."best_pizza_id" = "app_pizza"."id") LEFT OUTER JOIN "app_country" ON ("app_pizza"."country_id" = "app_country"."id")
By the way, there is no need to write code like the following. When you follow parent, parent's parent, and so on, it is enough to specify the most distant parent.
If you specify it with double underscores, the tables along the way are joined properly as well.
# Redundant example: specifying 'best_pizza' is unnecessary.
select_related('best_pizza', 'best_pizza__country')
No.3 Fetching multiple children with prefetch_related
This is the basic form that gets explained most often.
Pizza and Topping have a many-to-many relationship; we fetch with Pizza as the base.
Consider the case "for each fetched pizza, I want all of its toppings". Accessing it as below issues, in addition to the query that fetches the pizzas (the first one), one query per fetched pizza to get the toppings tied to that pizza.
f(connection.queries) reset_queries() for pizza in Pizza.objects.all(): print(f'{pizza.name}', ','.join([t.name for t in pizza.toppings.all()])) ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³ ãã¶B ããã,ãã¯ã«ã¹,ãã¤ãããã«,ãã¼ãº ãã¶C ããã,ç¼ãé
>>> f(connection.queries) SELECT "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" SELECT "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" = 1 SELECT "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" = 2 SELECT "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" = 3
In a case like this, the approach differs somewhat from plain SQL.
With prefetch_related, the toppings are fetched wholesale in a separate query and joined by Python code.
Personally I find prefetch_related easier to understand if you think of it as a caching feature (something that happens on the Python side).
Now let's look at the actual queries.
>>> reset_queries() >>> for pizza in Pizza.objects.prefetch_related('toppings').all(): print(f'{pizza.name}', ','.join([t.name for t in pizza.toppings.all()])) ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³ ãã¶B ããã,ãã¯ã«ã¹,ãã¤ãããã«,ãã¼ãº ãã¶C ããã,ç¼ãé
>>> f(connection.queries) SELECT "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" IN (1, 2, 3)
After the first query fetches the pizzas (ids 1, 2, 3), a second query is issued.
It specifies all the pizza IDs in an IN clause and fetches all of the toppings in one go. The idea is that the result of this query is cached and used whenever the print above needs it.
You cannot see it from the output, but Python code looks up the relevant rows in that cache.
(Note) The source for the explanation above: QuerySet API reference | Django documentation | Django
No.4 Filtering children with a Prefetch object
As described above, prefetch_related works by caching.
So if you access the relation in a pattern that differs from the cached query, the extra prefetch_related query is simply wasted.
The example below uses filter at access time. In this case the prefetched result is not used and SQL is issued again.
- prefetch_related specifies all.
- filter is specified at access time.
for pizza in Pizza.objects.prefetch_related('toppings').all(): print(f'{pizza.name}', ','.join([t.name for t in pizza.toppings.filter(id__gte=3)])) ãã¶A ãã¼ã³ã³ ãã¶B ãã¤ãããã«,ãã¼ãº ãã¶C ç¼ãé
>>> f(connection.queries) SELECT "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" IN (1, 2, 3) SELECT "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE ("app_pizza_toppings"."pizza_id" = 1 AND "app_topping"."id" >= 3) SELECT "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE ("app_pizza_toppings"."pizza_id" = 2 AND "app_topping"."id" >= 3) SELECT "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE ("app_pizza_toppings"."pizza_id" = 3 AND "app_topping"."id" >= 3)
When you want to filter before prefetching, specify it with a Prefetch object.
The toppings side simply specifies all, yet it is properly filtered. Think of this style as the result of all being overwritten.
from django.db.models import Prefetch >>> reset_queries() >>> for pizza in Pizza.objects.prefetch_related(Prefetch('toppings', queryset=Topping.objects.filter(id__gte=3))): print(f'{pizza.name}', ','.join([t.name for t in pizza.toppings.all()])) ãã¶A ãã¼ã³ã³ ãã¶B ãã¤ãããã«,ãã¼ãº ãã¶C ç¼ãé
>>> f(connection.queries) SELECT "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE ("app_topping"."id" >= 3 AND "app_pizza_toppings"."pizza_id" IN (1, 2, 3))
By specifying the to_attr argument you can access the cache customized by the Prefetch object explicitly, so this style seems preferable.
>>> for pizza in Pizza.objects.prefetch_related(Prefetch('toppings', queryset=Topping.objects.filter(id__gte=3), to_attr='filtered_toppings')): print(f'{pizza.name}', ','.join([t.name for t in pizza.filtered_toppings]))
No.5 Two-level relations with prefetch_related: ManyToMany-ManyToMany
You can specify this by chaining with further double underscores.
- With restaurants as the base, fetch all the pizzas they serve and all of those pizzas' toppings.
- Restaurant ⇄ Pizza ⇄ Topping
  - Fetch all pizzas tied to each restaurant.
  - Fetch all toppings tied to each pizza.
for restaurant in Restaurant.objects.prefetch_related('pizzas__toppings').all(): print(f'{restaurant.name}åºã®ãã¶ä¸è¦§') for pizza in restaurant.pizzas.all(): print(f'\t{pizza.name}', ','.join([t.name for t in pizza.toppings.all()])) ã¬ã¹ãã©ã³1åºã®ãã¶ä¸è¦§ ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³ ãã¶B ããã,ãã¯ã«ã¹,ãã¤ãããã«,ãã¼ãº ã¬ã¹ãã©ã³2åºã®ãã¶ä¸è¦§ ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³ ãã¶C ããã,ç¼ãé
- Without prefetch_related, a query is issued per restaurant and per pizza.
- With prefetch_related it collapses into the following three queries:
  - Fetch the list of restaurants.
  - Specify the restaurant IDs in an IN clause and fetch all the corresponding pizzas.
  - Specify the pizza IDs in an IN clause and fetch all the corresponding toppings.
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id" FROM "app_restaurant" SELECT ("app_restaurant_pizzas"."restaurant_id") AS "_prefetch_related_val_restaurant_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" INNER JOIN "app_restaurant_pizzas" ON ("app_pizza"."id" = "app_restaurant_pizzas"."pizza_id") WHERE "app_restaurant_pizzas"."restaurant_id" IN (1, 2) SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" IN (1, 2, 3)
No.6 Two-level relations with prefetch_related: ForeignKey-ManyToMany
* This is an inferior version of No.7, which follows.
Just like No.5, you can chain with double underscores. It is fine even if a model joined by FK sits in the middle.
- Restaurant → best pizza ⇄ Topping
  - The most popular pizza tied to the restaurant by FK.
  - Fetch all toppings tied to that pizza.
for restaurant in Restaurant.objects.prefetch_related('best_pizza__toppings').all(): print(f'{restaurant.name}åºã®ä¸çªäººæ°ã®ãã¶') print(f'\t{restaurant.best_pizza.name}', ','.join([t.name for t in restaurant.best_pizza.toppings.all()])) ã¬ã¹ãã©ã³1åºã®ä¸çªäººæ°ã®ã㶠ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³ ã¬ã¹ãã©ã³2åºã®ä¸çªäººæ°ã®ã㶠ãã¶C ããã,ç¼ãé
- Without prefetch_related, a query is issued per pizza.
- With prefetch_related it can be reduced to three queries:
  - Fetch the list of restaurants.
  - Specify the restaurant IDs in an IN clause and fetch the corresponding best pizzas (one each, since it is an FK).
  - Specify the pizza IDs in an IN clause and fetch all the corresponding toppings.
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id" FROM "app_restaurant" SELECT "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" WHERE "app_pizza"."id" IN (1, 3) SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" IN (1, 3)
No.7 Two-level relations with prefetch_related: ForeignKey-ManyToMany + select_related
An improved version of No.6. For the ForeignKey part it is better to use select_related and optimize at the SQL level than to prefetch it.
It is easy to picture if you imagine the pre-loading (prefetch) running after the main query, which already carries the select_related join.
for restaurant in Restaurant.objects.select_related('best_pizza').prefetch_related('best_pizza__toppings').all(): print(f'{restaurant.name}åºã®ä¸çªäººæ°ã®ãã¶') print(f'\t{restaurant.best_pizza.name}', ','.join([t.name for t in restaurant.best_pizza.toppings.all()])) ã¬ã¹ãã©ã³1åºã®ä¸çªäººæ°ã®ã㶠ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³ ã¬ã¹ãã©ã³2åºã®ä¸çªäººæ°ã®ã㶠ãã¶C ããã,ç¼ãé
- It can be reduced to two queries:
  - Fetch the list of restaurants together with the best pizza (FK) in a single query.
  - Specify the pizza IDs in an IN clause and fetch all the corresponding toppings.
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_restaurant" INNER JOIN "app_pizza" ON ("app_restaurant"."best_pizza_id" = "app_pizza"."id") SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" IN (1, 3)
No.8 Using order_by on a two-level relation with Prefetch
- Restaurant ⇄ Pizza ⇄ Topping: order the toppings by name, descending.
- Note that the Pizza in the middle is also prefetched (think of this as No.5 plus ordering).
>>> for restaurant in Restaurant.objects.prefetch_related(Prefetch('pizzas__toppings', queryset=Topping.objects.order_by('-name'))): print(f'{restaurant.name}åºã®ãã¶ä¸è¦§') for pizza in restaurant.pizzas.all(): print(f'\t{pizza.name}', ','.join([t.name for t in pizza.toppings.all()])) ã¬ã¹ãã©ã³1åºã®ãã¶ä¸è¦§ ãã¶A ãã¼ã³ã³,ãã¯ã«ã¹,ããã ãã¶B ãã¯ã«ã¹,ãã¤ãããã«,ããã,ãã¼ãº ã¬ã¹ãã©ã³2åºã®ãã¶ä¸è¦§ ãã¶A ãã¼ã³ã³,ãã¯ã«ã¹,ããã ãã¶C ç¼ãé,ããã
- The topping-fetch query that lists the pizza IDs in IN gets an ORDER BY.
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id" FROM "app_restaurant" SELECT ("app_restaurant_pizzas"."restaurant_id") AS "_prefetch_related_val_restaurant_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" INNER JOIN "app_restaurant_pizzas" ON ("app_pizza"."id" = "app_restaurant_pizzas"."pizza_id") WHERE "app_restaurant_pizzas"."restaurant_id" IN (1, 2) SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE "app_pizza_toppings"."pizza_id" IN (1, 2, 3) ORDER BY "app_topping"."name" DESC
No.9 Two-level relations with prefetch_related: ManyToMany-ForeignKey
In No.7 the direct relation from the queried model was a ForeignKey and the one beyond it was ManyToMany. This time the direct relation is ManyToMany, with a model joined by ForeignKey beyond it.
A helpful mental image: when prefetching, instruct it to also JOIN the target FK. The point is to use select_related inside the Prefetch's queryset.
- Pizza ⇄ Restaurant → best pizza
  - Pizza ⇄ Restaurant: prefetch as a separate query (pizza IDs specified in an IN clause).
  - Restaurant → best pizza: fetched already joined in SQL via select_related.
for pizza in Pizza.objects.prefetch_related(Prefetch('restaurants', queryset=Restaurant.objects.select_related('best_pizza'))): print(f'{pizza.name}ãæä¾ããã¦ãã¬ã¹ãã©ã³ä¸è¦§') for restaurant in pizza.restaurants.all(): print(f'\t{restaurant}ã®ä¸çªäººæ°ã®ãã¶ã¯: {restaurant.best_pizza.name}') ãã¶Aãæä¾ããã¦ãã¬ã¹ãã©ã³ä¸è¦§ ã¬ã¹ãã©ã³1ã®ä¸çªäººæ°ã®ãã¶ã¯: ãã¶A ã¬ã¹ãã©ã³2ã®ä¸çªäººæ°ã®ãã¶ã¯: ãã¶C ãã¶Bãæä¾ããã¦ãã¬ã¹ãã©ã³ä¸è¦§ ã¬ã¹ãã©ã³1ã®ä¸çªäººæ°ã®ãã¶ã¯: ãã¶A ãã¶Cãæä¾ããã¦ãã¬ã¹ãã©ã³ä¸è¦§ ã¬ã¹ãã©ã³2ã®ä¸çªäººæ°ã®ãã¶ã¯: ãã¶C
Note that the pizza IDs fetched first are put into an IN clause, so all the restaurants are fetched in a single shot. On top of that, those restaurants' best pizzas are already joined at the SQL level.
>>> f(connection.queries) SELECT "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" SELECT ("app_restaurant_pizzas"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id", T4."id", T4."name", T4."country_id" FROM "app_restaurant" INNER JOIN "app_restaurant_pizzas" ON ("app_restaurant"."id" = "app_restaurant_pizzas"."restaurant_id") INNER JOIN "app_pizza" T4 ON ("app_restaurant"."best_pizza_id" = T4."id") WHERE "app_restaurant_pizzas"."pizza_id" IN (1, 2, 3)
Omitted here, but without prefetch and friends, many queries are issued:
- the pizza list is fetched, then the restaurants are fetched with one query per pizza ID;
- then the pizzas are fetched again using each restaurant's best-pizza ID as a query.
No.10 Multiple filtered prefetches with to_attr
- The case where you want to filter the same relation in several different ways at the same time.
- For Restaurant ⇄ Pizza, filter the pizzas in multiple ways.
  - Fetch "the Italian pizzas tied to each restaurant" and "all of its pizzas" at the same time.
As an aside, it is important that no SQL is issued at the point where italy_pizza and all_pizza are defined. QuerySets are lazily evaluated, so nothing is issued until actual values are fetched.
italy_pizza = Pizza.objects.filter(country=italy) all_pizza = Pizza.objects.all() for restaurant in Restaurant.objects.prefetch_related(Prefetch('pizzas', queryset=italy_pizza, to_attr='italy'), Prefetch('pizzas', queryset=all_pizza, to_attr='all_pizzas')): print(f'{restaurant.name}åº') print('\t', ','.join([pizza.name for pizza in restaurant.italy])) print('\t', ','.join([pizza.name for pizza in restaurant.all_pizzas])) ã¬ã¹ãã©ã³1åº ãã¶A ãã¶A,ãã¶B ã¬ã¹ãã©ã³2åº ãã¶A ãã¶A,ãã¶C
The SQL count grows by the number of Prefetch objects you specify.
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id" FROM "app_restaurant" SELECT ("app_restaurant_pizzas"."restaurant_id") AS "_prefetch_related_val_restaurant_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" INNER JOIN "app_restaurant_pizzas" ON ("app_pizza"."id" = "app_restaurant_pizzas"."pizza_id") WHERE ("app_pizza"."country_id" = 1 AND "app_restaurant_pizzas"."restaurant_id" IN (1, 2)) SELECT ("app_restaurant_pizzas"."restaurant_id") AS "_prefetch_related_val_restaurant_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" INNER JOIN "app_restaurant_pizzas" ON ("app_pizza"."id" = "app_restaurant_pizzas"."pizza_id") WHERE "app_restaurant_pizzas"."restaurant_id" IN (1, 2)
No.11 Prefetching one- and two-level relations, each with conditions (ManyToMany-ManyToMany)
- The pattern where the one-level-away ManyToMany is prefetched with a filter condition, and the relation two levels away is also fetched filtered.
- Naming the first prefetch with to_attr makes the second level addressable with double underscores.
- Similar patterns described so far:
  - No.5: all (one level away), all (two levels away).
  - No.8: all (one level away), order_by on the second level.
  - No.9: all (one level away), select_related for the second level.
The point is that when the second level (toppings) is fetched, only the rows related to the already-filtered first level (pizzas) are fetched.
italy_pizza = Pizza.objects.filter(country=italy) for restaurant in Restaurant.objects.prefetch_related(Prefetch('pizzas', queryset=italy_pizza, to_attr='italy'), Prefetch('italy__toppings', queryset=Topping.objects.filter(id__lte=2), to_attr='topping_2')): print(f'{restaurant.name}åº') for pizza in restaurant.italy: print(f'\t{pizza.name}', ','.join([t.name for t in pizza.topping_2])) ã¬ã¹ãã©ã³1åº ãã¶A ããã,ãã¯ã«ã¹ ã¬ã¹ãã©ã³2åº ãã¶A ããã,ãã¯ã«ã¹
Note that the first level (Pizza) is filtered both by the restaurant results and by country=1, and that the resulting pizza IDs are used in the IN clause of the third query, which fetches the toppings with the id<=2 filter applied at the same time.
>>> f(connection.queries) SELECT "app_restaurant"."id", "app_restaurant"."name", "app_restaurant"."best_pizza_id" FROM "app_restaurant" SELECT ("app_restaurant_pizzas"."restaurant_id") AS "_prefetch_related_val_restaurant_id", "app_pizza"."id", "app_pizza"."name", "app_pizza"."country_id" FROM "app_pizza" INNER JOIN "app_restaurant_pizzas" ON ("app_pizza"."id" = "app_restaurant_pizzas"."pizza_id") WHERE ("app_pizza"."country_id" = 1 AND "app_restaurant_pizzas"."restaurant_id" IN (1, 2)) SELECT ("app_pizza_toppings"."pizza_id") AS "_prefetch_related_val_pizza_id", "app_topping"."id", "app_topping"."name" FROM "app_topping" INNER JOIN "app_pizza_toppings" ON ("app_topping"."id" = "app_pizza_toppings"."topping_id") WHERE ("app_topping"."id" <= 2 AND "app_pizza_toppings"."pizza_id" IN (1))
If you do not need to filter the second level, you can simply write it like this.
>>> for restaurant in Restaurant.objects.prefetch_related(Prefetch('pizzas', queryset=italy_pizza, to_attr='italy'), 'italy__toppings'): print(f'{restaurant.name}åº') for pizza in restaurant.italy: print(f'\t{pizza.name}', ','.join([t.name for t in pizza.toppings.all()])) ã¬ã¹ãã©ã³1åº ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³ ã¬ã¹ãã©ã³2åº ãã¶A ããã,ãã¯ã«ã¹,ãã¼ã³ã³
No.12 Fetching a single child starting from the parent
- The case where, starting from the parent, you want something like a .get(<condition>) on the child model.
- Even when the result is a single row it comes back as a list; there seems to be no way around that.
- Specify it with a prefetch and take element 0 (see the sketch below).
If you can instead search from the child side and select_related the related parent, prefer that approach.
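The original has no sample code for this pattern, so here is a minimal sketch against the same models (the ordered_pizzas name and the order_by('id') condition are only illustrative; swap in whatever filter you actually need):

from django.db.models import Prefetch
from app.models import Restaurant, Pizza

# Prefetch the related pizzas into a plain list per restaurant (to_attr), then take element 0.
restaurants = Restaurant.objects.prefetch_related(
    Prefetch('pizzas', queryset=Pizza.objects.order_by('id'), to_attr='ordered_pizzas')
)
for restaurant in restaurants:
    first_pizza = restaurant.ordered_pizzas[0] if restaurant.ordered_pizzas else None
    print(restaurant.name, first_pizza)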
qtatsu's procedure notes
Introduction
This page is meant to be referenced from my own blog posts. I plan to add to it and change it frequently.
1. Setting up a Django environment
Prerequisites
| | Version |
|---|---|
| MacOS Ventura | 13.5.2 |
| Python3 | 3.11.4 |
Reference links
The official documentation:
- How to install Django | Django documentation | Django
- Writing your first Django app, part 1 | Django documentation | Django
Steps
Create a virtual environment
$ python3 -m venv env
$ source env/bin/activate

# Check
(env) $ python --version
Python 3.11.4
Everything below is done inside the virtual environment (the prompt shows (env)).
Upgrade pip
(env) $ pip install --upgrade pip
Install Django
(env) $ pip install Django==5.0.1
Start the project
Create a directory called project and run the startproject command inside it.
(env) $ mkdir project
(env) $ cd project
(env) $ django-admin startproject config .
(env) $ ls
config/ manage.py*
Specifying "." makes manage.py end up in that same directory (project). See the following link for details.
django-admin and manage.py | Django documentation | Django
Add the app
(env) $ python manage.py startapp app
Add the following to the INSTALLED_APPS list in the settings file.
INSTALLED_APPS = [
    'app.apps.AppConfig',  # added
    # ...snip...
]
Migration
$ python manage.py migrate
Check that it works
$ python manage.py runserver 8000
Access the port you specified. If you ran it as above, that is http://localhost:8000/.
If the rocket page appears, it worked.
[Python] Three ways to compare list elements while ignoring order: sort, assertCountEqual, deepdiff
(* This is a migrated version of an article I originally wrote on Qiita.)
[Python] Testing list comparison while ignoring order (DeepDiff) - Qiita
- Conclusion: use deepdiff
- Introduction
- Reference links
- Environment
- A list of strings: use sort
- A list of dicts: sort by specifying a key
- A list of dicts: use assertCountEqual
- A list of dicts whose values contain lists: use DeepDiff
- Conclusion
- Summary
çµè«: deepdiffã使ã
å®ç¾ãããæ¡ä»¶ã¯ä»¥ä¸ã®äºã¤.
- è¾æ¸ãè¦ç´ ã¨ãã¦æã¤ãªã¹ããããããåããªã¹ããããæ¯ã¹ãã.
- ãã ã並ã³é ã¯ç°ãªã£ã¦ãã¦ãè¯ããã¨ã¨ããã
- è¾æ¸ã®ããvalue(ä¸ä¾ã§ã¯
spells
)ããªã¹ãã¨ãªã£ã¦ããã- ãã¡ãã®è¦ç´ ã並ã³é ã¯ç°ãªã£ã¦ãã¦ãè¯ããã¨ã¨ããã
>>> dict_in_list1 [ {'name': 'Reimu', 'spells': ['Musouhuin', 'niju-kekkai']}, {'name': 'Marisa', 'spells': ['non-directional laser', 'star-dust reverie']}, {'name': 'Alice', 'spells': ['hourai-doll', 'shanghai-doll']} ] >>> dict_in_list2 [ {'name': 'Marisa', 'spells': ['star-dust reverie', 'non-directional laser']}, {'name': 'Reimu', 'spells': ['Musouhuin', 'niju-kekkai']}, {'name': 'Alice', 'spells': ['hourai-doll', 'shanghai-doll']} ]
DeepDiffã使ãã¨ã以ä¸ã®ããã«ãã¦åããã¼ã¿ã§ããããæ¯è¼ã§ãã.
pytest
assert not DeepDiff(dict_in_list1, dict_in_list2, ignore_order=True)
unittest
self.assertEqual(DeepDiff(dict_in_list1, dict_in_list2, ignore_order=True), {})
Preface

When writing test code, the assert section can balloon until it is hard to see what the test is actually checking.

Of course, the first line of defense is to keep tests small and to check only the critical parts.

Still, there are cases like the following, where you want to check an API response or the result of some processing as-is.

- Data that stops making sense if you pull out only part of it.
- You are developing or refactoring without tests and want to quickly put together a test that confirms the current return value doesn't break during the refactor.

In such cases you end up wanting to compare the objects directly, but comparing list elements while ignoring order can become surprisingly hard depending on the element type.

Besides DeepDiff, this article also covers the sorting approach and the assertCountEqual method.

References
GitHub - seperman/deepdiff: Deep Difference and search of any Python object/data.
unittest --- ユニットテストフレームワーク — Python 3.10.0b2 ドキュメント

Environment

| | Version |
|---|---|
| MacOS Big Sur | 11.6 |
| Python3 | 3.9.1 |
| deepdiff | 5.7.0 |

List of strings: use sort
Sorting is probably the first approach that comes to mind, and since it keeps things simple and makes it visually obvious what the comparison is doing, I think you should use it whenever possible.
If the elements are sortable, like strings (that is, they can be compared with the < operator because __lt__ is defined), it's easy.
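As an aside, here is a quick sketch of what "sortable" means, using a hypothetical Member class: defining __lt__ is all sorted() needs in order to put the elements in order.

from dataclasses import dataclass


@dataclass
class Member:
    name: str

    def __lt__(self, other):
        # sorted() only requires "<" between elements.
        return self.name < other.name


a = [Member("Reimu"), Member("Alice")]
b = [Member("Alice"), Member("Reimu")]
print(sorted(a) == sorted(b))  # True (the dataclass also generates __eq__)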
Let's compare the following two lists.

Note that they contain the same elements, just in a different order (we want them judged to be the same).

names1 = ["Reimu", "Alice", "Marisa"]
names2 = ["Reimu", "Marisa", "Alice"]

Comparing lists compares the elements in order from the front, so a direct comparison won't do.
>>> names1 == names2
False
Sort them.

>>> sorted(names1) == sorted(names2)
True

List of dicts: sort with a key

In the next example, dicts are the elements of the list.

It looks similar to the previous one, but this time the list can't be sorted as-is.

As before, note that the elements are the same but the order differs (we want them judged to be the same).

names1 = [
    {"name": "Reimu"},
    {"name": "Marisa"},
    {"name": "Alice"},
]
names2 = [
    {"name": "Alice"},
    {"name": "Reimu"},
    {"name": "Marisa"},
]

First, comparing them directly gives False (the order is allowed to differ, so we actually want True).
>>> names1 == names2
False
Dicts don't define the < comparison between each other, so they can't be sorted directly.

>>> sorted(names1)
TypeError: '<' not supported between instances of 'dict' and 'dict'

Even so, if every dict is guaranteed to have a name key and the values don't collide, you can sort by specifying a key.

When you pass key, the sort compares the values stored under the name key.

>>> sorted(names1, key=lambda x: x["name"])
[
    {'name': 'Alice'},
    {'name': 'Marisa'},
    {'name': 'Reimu'}
]

Each dict is passed in turn as the argument x of the lambda given to key; the value under its name key is pulled out and used for the comparison.

Now we can confirm that the lists contain the same elements regardless of order.

>>> sorted(names1, key=lambda x: x["name"]) == sorted(names2, key=lambda x: x["name"])
True

The value returned by key can also be a tuple, so I believe this can handle cases where the name key alone isn't enough to sort on (untested).

That said, if this is for test code, I'd think twice before writing complicated sort conditions just to compare results.
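For the record, here is what a tuple key might look like (a small sketch of the untested idea above, with a hypothetical age field added as the tie-breaker):

people1 = [{"name": "Reimu", "age": 17}, {"name": "Reimu", "age": 18}]
people2 = [{"name": "Reimu", "age": 18}, {"name": "Reimu", "age": 17}]

# Tuples compare element by element, so (name, age) breaks the tie
# that name alone cannot.
key = lambda x: (x["name"], x["age"])
print(sorted(people1, key=key) == sorted(people2, key=key))  # True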
List of dicts: use assertCountEqual

For cases like this, the Python standard library has one more powerful tool.

It's assertCountEqual, one of the assert helper methods implemented on the TestCase class in the unittest module.

Despite the impression the name gives, it's an assert method that verifies both sequences contain the same elements, each the same number of times, regardless of order.

unittest --- ユニットテストフレームワーク — Python 3.10.0b2 ドキュメント

If you're using unittest you can simply call self.assertCountEqual, so here I'll instantiate TestCase directly so the method can also be used from pytest and the like.

We'll compare the same list of dicts that we sorted with a key earlier.

names1 = [
    {"name": "Reimu"},
    {"name": "Marisa"},
    {"name": "Alice"},
]
names2 = [
    {"name": "Alice"},
    {"name": "Reimu"},
    {"name": "Marisa"},
]

>>> from unittest import TestCase
>>> case = TestCase()
>>> case.assertCountEqual(names1, names2)  # OK!!

Very easy!

The official documentation explains the mechanism; most built-in types can be compared without any special care. In my experience, sort or assertCountEqual covers the vast majority of cases.

The assertEqual() method dispatches to type-specific methods to check equality of objects of the same type. These methods are already implemented for most built-in types, and new ones can be registered with addTypeEqualityFunc().

The drawback, for our purpose, is that the method name ends up reading unnaturally.

The name actually describes the mechanism very well: it counts how many times each object appears, using Counter from the collections module.

So while using it to "compare lists ignoring order" is certainly possible, it feels like a slight detour from its intended use.

My guess is that assertCountEqual's intended use is for cases like the one below.
>>> fruits1 = ["りんご", "みかん", "みかん", "りんご", "りんご"]
>>> fruits2 = ["りんご", "みかん", "みかん", "りんご", "みかん"]
>>> case.assertCountEqual(fruits1, fruits2)
AssertionError: Element counts were not equal:
First has 3, Second has 2:  'りんご'
First has 2, Second has 3:  'みかん'
Ah, I see: the error message here is very easy to read...!!!
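To see the counting idea in isolation, here is a rough sketch of roughly what happens for hashable elements, using collections.Counter directly (assertCountEqual also handles unhashable elements through a fallback, so this is only an approximation):

from collections import Counter

fruits1 = ["りんご", "みかん", "みかん", "りんご", "りんご"]
fruits2 = ["りんご", "みかん", "みかん", "りんご", "みかん"]

# Counter maps each element to its number of occurrences, so equal Counters
# mean "same elements, same counts, in any order".
print(Counter(fruits1) == Counter(fruits2))  # False: 3 vs 2 occurrences of りんご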
List of dicts with list values: use DeepDiff

Most list comparisons can be handled with sorted or assertCountEqual.

And if you need something more complex than that, it's usually better to rethink how the test is structured or how the comparison is done.

Still, as mentioned at the start, there are times when you want to hit an API, take the actual value you got back, and use it in a test as-is.

When I'm writing temporary test code to keep a throwaway script from regressing while I fix it, for example, I do end up wanting to write that kind of assert.

In that situation you can hit this case: the list elements are dicts, those dicts in turn have lists as values, and you want to check that everything matches while ignoring the order of those inner lists too.

Here's an example.

>>> dict_in_list1
[
    {'name': 'Reimu', 'spells': ['Musouhuin', 'niju-kekkai']},
    {'name': 'Marisa', 'spells': ['non-directional laser', 'star-dust reverie']},
    {'name': 'Alice', 'spells': ['hourai-doll', 'shanghai-doll']}
]
>>> dict_in_list2
[
    {'name': 'Marisa', 'spells': ['star-dust reverie', 'non-directional laser']},
    {'name': 'Reimu', 'spells': ['Musouhuin', 'niju-kekkai']},
    {'name': 'Alice', 'spells': ['hourai-doll', 'shanghai-doll']}
]

In the two lists above, the order of the dict elements is swapped.

On top of that, looking at the name: Marisa entry, its spells value is a list, and as excerpted below, the order there is reversed as well.

'spells': ['non-directional laser', 'star-dust reverie']
'spells': ['star-dust reverie', 'non-directional laser']

We want these to be judged identical, inner lists included.

Incidentally, assertCountEqual judges them to be different elements, as shown below.

>>> case.assertCountEqual(dict_in_list1, dict_in_list2)
AssertionError: Element counts were not equal:
First has 1, Second has 0:  {'name': 'Marisa', 'spells': ['non-directional laser', 'star-dust reverie']}
First has 0, Second has 1:  {'name': 'Marisa', 'spells': ['star-dust reverie', 'non-directional laser']}

There is a third-party library that can judge objects like these equal in one shot: deepdiff.

It can do far more than this, but here I'll describe it only from the testing angle.

Installation is a simple pip install.
$ pip install deepdiff
For our case (judging equality while ignoring list order), all you need is to pass ignore_order=True, as shown below.

>>> DeepDiff(dict_in_list1, dict_in_list2, ignore_order=True)
{}  # an empty deepdiff.diff.DeepDiff object is returned

Then, as shown at the top of the article, just check the result with an assert statement or assert method.

When there is a difference, DeepDiff tells you which element under which key differs, and how.

>>> dict_in_list3 = [
    {"name": "Marisa", "spells": ["star-dust reverie", "non-directional laser"]},
    {"name": "Reimu", "spells": ["niju-kekkai"]},
]
>>> DeepDiff(dict_in_list1, dict_in_list3, ignore_order=True)
{
    'iterable_item_removed': {
        "root[0]['spells'][0]": 'Musouhuin',
        'root[2]': {
            'name': 'Alice',
            'spells': ['hourai-doll', 'shanghai-doll']
        }
    }
}

The information removed in dict_in_list3 is reported together with where it sits in the hierarchy.

Conclusion

- Whenever possible, sort with sorted and compare.
- If the sort condition would get complicated, consider the assertCountEqual method.
- For even trickier situations, DeepDiff with ignore_order=True does the job.

Summary

I introduced three ways to verify that two lists hold the same elements regardless of order.

Of course, it's better to write tests that don't have to compare hard-to-compare objects in the first place, and overusing these techniques can make tests harder to read.

But in certain contexts, the methods introduced here are worth keeping as options.

I'd be glad to hear comments like "there's a better way" or "I'd do it like this instead".
[Python] Format only commit diffs with black (darker)

Preface

Code formatting is something best left to a formatter.

Ideally, so that everyone formats code in the same style, you use something like pre-commit to run the formatter automatically at commit time.

However, there are cases where introducing that is difficult, for example when you join a project partway through.
In my case this time, the project had no formatter in place, and:

- I at least wanted my own commits to be formatted with black.
- When updating the shared library, I wanted to format by line, not by whole file.

That was the situation. Giving up on a root-cause fix and looking for the next best thing, I found darker, a wrapper library around black (and isort), which looked promising.
(Incidentally, the darker author mentions in the GitHub README that black itself may gain a line-range formatting feature in the future.)

There seemed to be little information on darker in Japanese, so I'm writing up what I tried.
References

Environment

| | Version |
|---|---|
| MacOS Big Sur | 11.6 |
| Python3 | 3.10.2 |
| darker | 1.3.2 |
| black | 21.12b0 |
| Pygments | 2.11.2 |

Installing darker

Installation

Create a virtual environment and pip install it.

$ python3.10 -m venv env
$ source env/bin/activate
(env)$ pip install darker
Note: you need to pin black to an older version (as of 2022-02-05).

(2022-03-04) Addendum: this has since been fixed, so this item is no longer necessary, but I'm leaving it here as a record.

As of 2022-02-05, darker cannot be used as-is.
darker is a wrapper around black, so installing darker pulls in the latest black as well. However, the latest black (22.1.0, which finally dropped the beta label) changed the return type of the find_project_root function from Path to a tuple, and darker doesn't handle that yet.

The darker author (akaihola) already seems to be working on a fix in this pull request. Addendum: it has since been fixed.

For now, it's enough to downgrade black to the last beta release.

(As an aside, running pip like this lets you see the installable versions, which is handy. It's probably not an officially sanctioned technique, though...)

(env)$ pip install black==
............................(snip)...........................
 20.8b1, 21.4b0, 21.4b1, 21.4b2, 21.5b0, 21.5b1, 21.5b2, 21.6b0, 21.7b0, 21.8b0, 21.9b0, 21.10b0, 21.11b0, 21.11b1, 21.12b0, 22.1.0)
ERROR: No matching distribution found for black==

The release just before the latest is 21.12b0, so downgrade to that.
(env)$ pip install black==21.12b0
Adding color with Pygments

One more small step to make the output easier to read.

If Pygments is installed in the same environment, darker colorizes its output.
(env)$ pip install Pygments==2.11.2
No extra configuration is needed.
darker's output, which is plain text by default, then comes out nicely colorized.
Usage

Formatting new changes

First, darker relies on git diff, so the directory has to be under git control.

Create a suitable directory and make a first commit (the contents don't matter).

$ git init
$ touch README.md
$ git add README.md
$ git commit -m "first"

Now that HEAD exists, darker can be used. Create a Python file like the following (darker_test.py).

(It just needs to be too long horizontally; the content itself is arbitrary.)

def format_name_and_age_to_profile(name: str | None, age: int | None, address: str | None):
    return f"{name} -- {age} -- {address}"

Commit at this point; we want to verify that darker acts on the commit diff.

$ git add darker_test.py
$ git commit -m "first function"
$ git log --oneline
7e25910 (HEAD -> master) first function
b6bee90 first

Now add one more definition to darker_test.py so that it looks like this.

def format_name_and_age_to_profile(name: str | None, age: int | None, address: str | None):
    return f"{name} -- {age} -- {address}"


def format_name_and_age_to_profile_version_2(name: str | None, age: int | None, address: str | None):
    return f"{name} -- {age} -- {address}"  # second commit

Don't commit yet! (git add is fine.)

Now let's have darker print the fix as a diff.

Specify either the current directory (.) or the file directly, as below; the result is the same either way.

$ darker --diff .               # current directory
$ darker --diff darker_test.py  # specific file

You can see that only this new change gets reformatted; the first function, which was already committed, is left alone.

Also, at this point the original file has not been modified. If you want the fix applied to the file, drop the --diff option.
$ darker .
After formatting, the file looks like this. Only the second function (the part not yet committed) has been reformatted.

def format_name_and_age_to_profile(name: str | None, age: int | None, address: str | None):
    return f"{name} -- {age} -- {address}"


def format_name_and_age_to_profile_version_2(
    name: str | None, age: int | None, address: str | None
):
    return f"{name} -- {age} -- {address}"  # second commit

Let's commit this for now.

$ git add .
$ git commit -m "second function (formatted)"

Formatting a specified commit range

Since darker uses git diff, you can also specify a range, from one commit to another, and format only the changes made in that range.

The commit log from earlier looks like this.

$ git log --oneline
8de1f64 (HEAD -> master) second function (formatted)
7e25910 first function
b6bee90 first

The first function never got formatted, so let's target it now. It was changed between the first commit (b6bee90) and 7e25910, so specify the range as follows.
You need to give a PATH at the end; I had to either name the file directly (darker_test.py) or use a wildcard as below. Specifying "." (the current directory) didn't seem to be accepted here, for whatever reason.

$ darker --diff --revision b6bee90..7e25910 *   # *.py or darker_test.py also work

The help also says to separate the commits with ... (three dots). Two dots worked as well, though I'm not sure what the difference is.

The result: the first function, written within the specified commit range, now gets formatted.
Using darker with pre-commit

First, install pre-commit.

The official page is very clear.

(env)$ pip install pre-commit
(env)$ pre-commit --version
pre-commit 2.17.0

Write .pre-commit-config.yaml based on the reference example in darker's official GitHub repository.

Pin black to the older version here as well (as of 2022-02-06; for the reason described above).

repos:
  - repo: https://github.com/akaihola/darker
    rev: 1.3.2
    hooks:
      - id: darker
        additional_dependencies: [black==21.12b0]  # fails with the latest black
$ pre-commit install
Now write another function with a long name on a single line and let darker run via the hook.

Append the following...
def format_name_and_age_to_profile3(name: str | None, age: int | None, address: str | None):
    return f"{name} -- {age} -- {address}"

(env)$ git add .
(env)$ git commit

As a result, only the uncommitted diff should come out formatted.

Is there any point in formatting only your own code in the first place?
I asked a colleague about this, and the points raised were:

- automatic formatting is done in the first place to lighten the review burden, and
- the goal is a unified code style across the team.
Exactly right. Ideally the whole project shares the config files and runs black/isort/flake8/mypy via pre-commit; that is what automatic formatting is really for.
In my case:

- The existing Python code has never had a formatter applied, and styles such as quoting are all over the place.
- New code is basically added only by me, though it does get reviewed.
- There are existing Python scripts plus a library full of my own utility functions, and I move back and forth between them.

That was the situation.
So I did want a formatter, both to keep the code I write readable and to stop thinking about formatting while writing it. But I plan to use darker as a stopgap, keeping in mind that this is a bit removed from the real purpose of auto-formatting.

Of course, I'll keep reminding myself that this is not a root-cause solution.

Summary

- If possible, set up pre-commit when the project starts.
- Don't lose sight of what auto-formatting is for.
- Even so, when you want to format only commit diffs, darker is a solid option.

Comments and feedback are very welcome :pray:
[Python] Swapping default argument values during tests

- Preface
- References
- Environment
- Setup: the function under test, which retries over and over
- The test: a failing test takes a long time
- Approach 0: split the test
- Approach 1: overwrite __defaults__
- Approach 2: use partial to replace the default argument
- Summary

Preface

This article is the December 13 (Mon) entry of the Calendar for JSL(日本システム技研) | Advent Calendar 2021 - Qiita.

When writing Python test code, you sometimes want to change a default argument.
For example, take a function whose retry loop goes around many times on failure: to check its behavior, repeating once or twice is plenty.
If the value is specified at the call site, it's easy to mock; but when the default argument value is used as-is, that is, when the caller doesn't pass the argument at all, it's harder.
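For contrast, here is a minimal sketch of the easy case (the RETRY_COUNT constant and this variant of show_result are hypothetical): when the call site passes the value explicitly, you can simply patch whatever it reads that value from.

# main.py -- a hypothetical variant where the caller passes retry explicitly
RETRY_COUNT = 10

def show_result(url):
    content = request_with_retry(url, retry=RETRY_COUNT)
    ...

# In the test, patching the module-level constant is enough,
# because show_result looks it up at call time:
from unittest import mock

with mock.patch("main.RETRY_COUNT", 2):
    ...  # show_result(url) now retries only twice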
This article introduces ways to deal with that kind of case. If you know a better way, or think "you should really do it like this", I'd appreciate advice in the comments.

References
unittest --- ユニットテストフレームワーク — Python 3.10.0b2 ドキュメント
unittest.mock --- モックオブジェクトライブラリ — Python 3.10.0b2 ドキュメント
functools --- 高階関数と呼び出し可能オブジェクトの操作 — Python 3.10.0b2 ドキュメント
Environment

| | Version |
|---|---|
| MacOS Big Sur | 11.6 |
| Python3 | 3.9.1 |
| requests | 2.25.1 |
Setup: the function under test, which retries over and over
Let's look at the function we'll be testing.

Note that this is only a sample and not sufficient as production code.
main.py
import time

import requests
from requests.exceptions import ConnectionError


def request_with_retry(url, retry=10):
    for i in range(retry):
        try:
            result = requests.get(url)
        except ConnectionError as e:
            print(f"失敗{i}回目")
            time.sleep(1)
        else:
            return result.content


def show_result(url):
    content = request_with_retry(url)  # runs with the default retry count
    if content:
        return f"------- {content} ----------"
    else:
        return '結果を取得できなかった'

The request_with_retry function

- Sends a GET request to the given URL and returns the response body; returns None on failure.
- If it cannot connect to the URL, it retries as many times as the retry argument specifies.
  - The default is 10 retries.
show_result
- This is the function under test.
- Internally, it calls request_with_retry without specifying the retry count and uses the result.

Let's actually run it and check the behavior.

I'll use IPython to enter interactive mode.

import main

main.show_result("http://localhost:5000")

Nothing is running on local port 5000, so you'll see output like the following.
Since it sleeps one second per retry, execution takes a while (it's shown as a gif, so it looks faster than it actually is).
The test: a failing test takes a long time

Now let's write a test for the case where show_result cannot get a result.

Looking at the implementation, on failure it returns the string '結果を取得できなかった' ("could not get a result").

We'll write the test directly in main.py, using unittest this time.
https://docs.python.org/ja/3/library/unittest.html
from unittest import TestCase


class TestsMyFuncs(TestCase):
    def test_show_result_fail(self):
        url = 'http://localhost:5000'
        actual = show_result(url)
        self.assertEqual(actual, "結果を取得できなかった")

Running this gives the following.

It takes a very long time.

To simply save time you could shorten the sleep, but the root problem is that the default retry count is too large for a test.

So this time we'll think about changing that default value.

Approach 0: split the test

Right off the bat, a digression.

This particular case is simple enough that splitting the test would solve the problem.

show_result can be described as a function that calls request_with_retry and returns a different string depending on whether the result is None or not.

The "URL to result" work is done by request_with_retry, so there's no need to check it in this test; we can simply mock it out entirely.

from unittest import TestCase, mock


class TestsMyFuncs(TestCase):
    def test_show_result_fail(self):
        url = 'http://localhost:5000'
        with mock.patch('main.request_with_retry', return_value=None):
            actual = show_result(url)
        self.assertEqual(actual, "結果を取得できなかった")

Here's the result of running the test.

(env) python/tmp $ python -m unittest main.TestsMyFuncs
.
----------------------------------------------------------------------
Ran 1 test in 0.001s

OK

request_with_retry's own behavior (return None on failure, the content on success) can be verified in a separate test.

At the call site we limit the retry count to 2.

from unittest import TestCase, mock

from requests.exceptions import ConnectionError


class TestsMyFuncs(TestCase):
    # .......snip............

    def test_request_with_retry(self):
        url = 'http://localhost:5000'
        with mock.patch.object(requests, 'get', side_effect=ConnectionError):
            actual = request_with_retry(url, retry=2)  # change the retry count here
        self.assertIsNone(actual)

Running the test, it retries only twice.

(env) python/tmp $ python -m unittest main.TestsMyFuncs.test_request_with_retry
失敗0回目
失敗1回目
.
----------------------------------------------------------------------
Ran 1 test in 2.002s

OK

Unit tests are more useful, and more robust to change, when they are split up as much as possible.

Approach 1: overwrite __defaults__

Approach 0 (splitting the test) is best when you can do it, but there are cases where you can't.

First, let's try overwriting the default argument value itself.
This is the approach of replacing __defaults__, mentioned in the links above.
Enter interactive mode with IPython again, as before.

>>> import main
>>> main.request_with_retry
<function main.request_with_retry(url, retry=10)>
>>> main.request_with_retry.__defaults__
(10,)
As you can see, a Python function stores the values of its default arguments in an attribute called __defaults__.
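One detail worth knowing before swapping it (a small sketch with a hypothetical function): __defaults__ is a tuple covering only the parameters that have defaults, in declaration order, and keyword-only defaults live in __kwdefaults__ instead.

>>> def connect(host, port=80, timeout=10, *, verify=True):
...     pass
...
>>> connect.__defaults__    # positional-or-keyword defaults, in order
(80, 10)
>>> connect.__kwdefaults__  # keyword-only defaults
{'verify': True}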
We'll change the retry count by swapping this attribute out.

mock.patch.object is convenient for the swap.

unittest.mock --- モックオブジェクトライブラリ — Python 3.10.0b2 ドキュメント

class TestsMyFuncs(TestCase):
    def test_show_result_fail(self):
        url = 'http://localhost:5000'
        with mock.patch.object(request_with_retry, '__defaults__', (2, )):  # 2 retries
            actual = show_result(url)
        self.assertEqual(actual, "結果を取得できなかった")

Running the test, it retries only twice.

(env) python/tmp $ python -m unittest main.TestsMyFuncs
失敗0回目
失敗1回目
.
----------------------------------------------------------------------
Ran 1 test in 2.021s

OK

Approach 2: use partial to replace the default argument

This method uses the partial function from the functools module.

partial lets you bind values to a function's arguments to create a new function, which is useful outside of testing too.

It's quicker to just see it, so back into IPython we go.

>>> import main
>>> from functools import partial
>>> modified = partial(main.request_with_retry, retry=1)  # fix the retry count at 1
>>> modified('http://localhost:5000')
失敗0回目
# it stops here

That's how it behaves: think of modified as a new function that is request_with_retry with retry fixed to 1.

Now let's change the test code.

from functools import partial


class TestsMyFuncs(TestCase):
    def test_show_result_fail(self):
        url = 'http://localhost:5000'
        with mock.patch('main.request_with_retry', side_effect=partial(request_with_retry, retry=2)):
            actual = show_result(url)
        self.assertEqual(actual, "結果を取得できなかった")

Running the test, it retries only twice.

(env) python/tmp $ python -m unittest main.TestsMyFuncs
失敗0回目
失敗1回目
.
----------------------------------------------------------------------
Ran 1 test in 2.026s

OK
I use this second method more often. Compared with the __defaults__ approach:

- It's easier to read.
  - With partial, it's visually obvious that retry is being replaced with 2.
  - The __defaults__ version would need a comment, and nothing ties the value 2 to the retry parameter.
- It's easier to look things up.
  - Reading partial's docstring gives you the general idea.
  - I doubt you could understand the behavior from the docstring of __defaults__.

Summary

Splitting the test is the ideal, but cases where you want to change a default argument value do come up now and then, because of time constraints or because you can't change the implementation.

Personally, I recommend partial.

As I wrote in the preface, if you have

- a better way, or
- a more fundamental objection to the whole approach,

I'd be glad to hear it in a comment!