Human Generated Data

Title

Untitled (three sets of twins posed outside on grass lawn)

Date

1952-1953

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6580

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Person 99.8
Human 99.8
Person 99.7
Clothing 99.7
Apparel 99.7
Person 99.6
Person 99.6
Person 99.4
Person 99.4
Boy 97.2
Female 92
Face 90.2
Dress 87.9
Pants 87.9
Smile 82
People 80.3
Child 79.8
Kid 79.8
Girl 77.1
Outdoors 74.2
Shorts 72.9
Sailor Suit 71.8
Woman 71.3
Portrait 65.6
Photo 65.6
Photography 65.6
Leisure Activities 63.4
Teen 58.8
Sleeve 58.6
Shirt 56.6
Coat 56.6
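
Tag/confidence pairs like the Amazon list above match the shape returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how such a list could be retrieved and filtered (not the museum's actual pipeline; the bucket and key names are hypothetical, and AWS credentials are assumed to be configured):

```python
def detect_labels(bucket, key, min_confidence=50.0):
    """Fetch (label, confidence) pairs for an image stored in S3."""
    import boto3  # imported lazily; requires configured AWS credentials
    client = boto3.client("rekognition")
    resp = client.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MinConfidence=min_confidence,
    )
    # Each returned label carries a Name and a Confidence score (0-100),
    # matching the "Tag  confidence" pairs listed in this record.
    return [(label["Name"], label["Confidence"]) for label in resp["Labels"]]

def top_labels(labels, threshold=90.0):
    """Keep labels at or above the threshold, highest confidence first."""
    return sorted(
        [(name, conf) for name, conf in labels if conf >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )
```

Applied to the Amazon list above with a 90% threshold, this would keep only the entries down through Face (90.2).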

Clarifai
created on 2019-03-26

people 99.9
group together 99.1
adult 98.8
group 97.7
man 96.8
wear 95.8
woman 94.7
four 91.5
several 90.9
leader 90.7
child 90.3
administration 89.4
portrait 89.3
three 89.2
many 88.5
five 88
two 87.2
facial expression 83
offspring 82.2
outfit 81

Imagga
created on 2019-03-26

kin 76.2
people 24.5
man 22.8
person 19.2
male 16.4
black 15.6
silhouette 14.1
world 13.1
travel 12.7
outdoor 12.2
couple 12.2
sport 11.9
vacation 11.4
men 11.2
adult 11.1
happiness 11
player 10.8
tourism 10.7
ballplayer 10.4
beach 10.1
athlete 9.9
family 9.8
summer 9.6
walking 9.5
happy 9.4
water 9.3
tradition 9.2
park 9.1
old 9.1
sunset 9
religion 9
outdoors 9
boy 8.7
love 8.7
child 8.6
play 8.6
youth 8.5
art 8.5
relationship 8.4
portrait 8.4
sky 8.3
dress 8.1
trees 8
together 7.9
bride 7.7
mask 7.7
two 7.6
vintage 7.4
peaceful 7.3
business 7.3
tourist 7.2
contestant 7.2
recreation 7.2
spring 7.1
businessman 7.1

Google
created on 2019-03-26

Microsoft
created on 2019-03-26

person 99.6
outdoor 98.8
posing 61.5
sport 20.8
black and white 16.2
child 9
baseball 8.5
boy 7.7
street 6.5

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 89.9%
Sad 9.6%
Disgusted 0.7%
Happy 1%
Angry 0.9%
Confused 1.2%
Calm 84.8%
Surprised 1.9%

AWS Rekognition

Age 30-47
Gender Male, 50.2%
Sad 47.6%
Angry 45.4%
Disgusted 45.2%
Calm 50.1%
Surprised 45.7%
Confused 45.4%
Happy 45.6%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Calm 48%
Disgusted 45.2%
Sad 45.6%
Confused 45.5%
Happy 48.4%
Surprised 46.1%
Angry 46.3%

AWS Rekognition

Age 30-47
Gender Male, 53%
Surprised 45.2%
Sad 48.7%
Calm 50.6%
Confused 45.1%
Disgusted 45.1%
Happy 45.2%
Angry 45.2%

AWS Rekognition

Age 20-38
Gender Female, 97.9%
Surprised 4.3%
Sad 24.1%
Calm 3.6%
Confused 2.7%
Disgusted 2.9%
Happy 46.9%
Angry 15.5%

AWS Rekognition

Age 14-23
Gender Female, 50.3%
Happy 45.3%
Calm 49.5%
Sad 49%
Angry 45.4%
Surprised 45.3%
Disgusted 45.3%
Confused 45.2%
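
Each "AWS Rekognition" face record above follows the FaceDetail structure returned by Rekognition's DetectFaces API when called with Attributes=["ALL"]: an AgeRange, a Gender value, and a per-emotion confidence score. A small sketch (assuming that response shape) of how one such record could be condensed into the fields shown here:

```python
def dominant_emotion(emotions):
    """Return the (Type, Confidence) pair with the highest confidence."""
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

def summarize_face(face):
    """Condense one FaceDetail dict into the fields shown in this record."""
    age = face["AgeRange"]
    emotion, confidence = dominant_emotion(face["Emotions"])
    return {
        "age": f'{age["Low"]}-{age["High"]}',
        "gender": face["Gender"]["Value"],
        "dominant_emotion": emotion,
        "emotion_confidence": round(confidence, 1),
    }
```

For the first face above, for example, Calm (84.8%) outscores Sad, Happy, and the rest, so Calm would be reported as the dominant emotion.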

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

VTO