Human Generated Data

Title

Untitled (four girls in tutus)

Date

1949

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2040

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Person 99.5
Person 99.3
Person 98.8
Dance 98.7
Ballet 97.6
Ballerina 95.7
Shoe 92.7
Clothing 92.7
Footwear 92.7
Apparel 92.7
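
Labels like these come from Amazon Rekognition's DetectLabels operation. A minimal Python sketch of the kind of call that produces them follows; the S3 bucket and object key are hypothetical placeholders, not the pipeline actually used for this record.

    # Sketch: image labels from Amazon Rekognition via boto3.
    # The S3 bucket and key are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        MinConfidence=90,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")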

Clarifai
created on 2023-10-25

people 99.9
dancer 98.7
dancing 98.1
adult 97.9
man 97.3
group 96.1
woman 94.2
ballerina 94.1
wear 93
tutu 92.8
child 92.1
print 90.5
dress 90.1
three 87
ballet 87
ballet dancer 86.9
princess 86.1
retro 85.3
portrait 84.4
couple 83.4
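
Clarifai concepts of this kind can be requested over its v2 REST API. The sketch below is illustrative only; the model ID, API key, and image URL are placeholder assumptions, and Clarifai reports confidence as a 0-1 float that is scaled here to match the list above.

    # Sketch: concepts from the Clarifai v2 REST API.
    # Model ID, API key, and image URL are placeholders.
    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")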

Imagga
created on 2021-12-14

dancer 30.9
performer 24.3
dance 21.7
beach 19.8
people 19.5
entertainer 18.5
person 17.7
art 17.4
summer 17.3
sea 17.2
male 17
sand 17
man 16.8
bench 16.6
silhouette 16.5
park bench 16.1
water 16
couple 15.7
travel 14.8
sky 14.7
sunset 14.4
sport 13.5
ocean 13.3
musical instrument 13.1
vacation 13.1
sun 12.9
creation 12.5
outdoor 12.2
outdoors 11.9
scene 11.2
athlete 11.2
shore 11.1
lifestyle 10.8
building 10.5
seat 10.2
horse 10.2
romantic 9.8
walking 9.5
love 9.5
adult 9.3
tourist 9.2
city 9.1
tourism 9.1
old 9
landscape 8.9
life 8.9
happiness 8.6
holiday 8.6
men 8.6
tree 8.5
evening 8.4
runner 8.4
park 8.2
wind instrument 8.2
romance 8
together 7.9
day 7.8
sunny 7.7
two 7.6
relax 7.6
happy 7.5
dark 7.5
street 7.4
group 7.2
portrait 7.1
family 7.1
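
Imagga tags like these can be fetched from its v2 tagging endpoint with HTTP Basic authentication. A minimal sketch, with placeholder credentials and image URL:

    # Sketch: tags from the Imagga v2 API (credentials and URL are placeholders).
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")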

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

clothing 90.2
footwear 89.7
black and white 87.5
outdoor 87.4
person 84.8
text 79.9
woman 79.4
dance 77
posing 59.6
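
Tags in this form come from the Microsoft Azure Computer Vision service. A minimal sketch against its v3.2 analyze endpoint follows; the resource host and subscription key are placeholders, and Azure's 0-1 confidence is scaled to match the list above.

    # Sketch: tags from Azure Computer Vision v3.2 (host and key are placeholders).
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/photo.jpg"},
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")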

Color Analysis

Face analysis

AWS Rekognition

Age 24-38
Gender Female, 98.8%
Calm 56.8%
Sad 22%
Happy 17.9%
Confused 1.2%
Surprised 0.9%
Fear 0.5%
Angry 0.4%
Disgusted 0.2%

AWS Rekognition

Age 21-33
Gender Female, 59.3%
Surprised 80.4%
Calm 8.9%
Sad 4%
Happy 3.9%
Confused 1.7%
Fear 0.5%
Angry 0.4%
Disgusted 0.2%

AWS Rekognition

Age 16-28
Gender Male, 66.5%
Calm 39.5%
Happy 31.9%
Surprised 17.4%
Sad 4.5%
Fear 4.3%
Confused 1.1%
Angry 1%
Disgusted 0.2%

AWS Rekognition

Age 23-35
Gender Female, 77.3%
Calm 56.2%
Sad 22%
Happy 11.5%
Surprised 4.3%
Confused 3.4%
Fear 1.1%
Angry 1%
Disgusted 0.4%
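
The age ranges, gender guesses, and emotion scores above are per-face outputs of Amazon Rekognition's DetectFaces operation. A minimal sketch, again with a hypothetical S3 location:

    # Sketch: per-face attributes from Amazon Rekognition DetectFaces.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")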

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
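
The ratings above are Google Cloud Vision face-detection likelihoods, an enum running from VERY_UNLIKELY to VERY_LIKELY. A minimal sketch using the google-cloud-vision Python client, with a placeholder image path:

    # Sketch: face likelihood fields from Google Cloud Vision.
    # The image path is a placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        # Each field is a Likelihood enum, e.g. VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)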

Feature analysis

Amazon

Person 99.6%
Shoe 92.7%

Categories

Imagga

paintings art 95.1%
interior objects 3.5%

Text analysis

Amazon

12

Google

12
12
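
Detected strings such as the "12" above can be read back from Amazon Rekognition's DetectText operation. A minimal sketch, with a hypothetical S3 location:

    # Sketch: text detection with Amazon Rekognition DetectText.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
    )

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"])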