Human Generated Data

Title

Untitled (two women dancing in high school talent show)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16721

Machine Generated Data

Tags (confidence %)

Amazon
created on 2022-02-26

Person 98.7
Human 98.7
Dance Pose 98.2
Leisure Activities 98.2
Dance 97.2
Person 97
Person 94.1
Ballet 89.5
Clothing 87.2
Apparel 87.2
Female 87
Ballerina 81.3
Woman 74.8
Skirt 68.7
Dress 66.9
Flooring 63.1
Stage 55.4

Clarifai
created on 2023-10-29

people 99.8
monochrome 98.1
one 96.2
adult 96
woman 95.3
street 95.2
two 95.2
dancing 95.1
dancer 94.7
group together 93.4
music 92.9
wear 92.7
recreation 91
child 90
man 89.6
girl 88
indoors 84.6
group 83.8
dress 82
portrait 81.7

Imagga
created on 2022-02-26

sword 52.2
weapon 45
people 31.2
dancer 27.2
city 26.6
person 26.6
urban 26.2
business 24.3
men 24
performer 24
man 23.6
silhouette 22.4
adult 21.3
male 21.3
walking 18.9
black 18.7
women 18.2
professional 17.7
group 16.1
motion 15.4
building 15.1
entertainer 15.1
corporate 13.7
suit 13
street 12.9
human 12.7
office 12.2
attractive 11.9
work 11.8
architecture 11.7
crowd 11.5
businessman 11.5
walk 11.4
fashion 11.3
travel 11.3
portrait 11
modern 10.5
window 10.1
reflection 9.9
life 9.8
success 9.7
teacher 9.6
standing 9.6
meeting 9.4
elegance 9.2
pretty 9.1
transportation 9
interior 8.8
body 8.8
happy 8.8
move 8.6
legs 8.5
blur 8.4
transport 8.2
businesswoman 8.2
lady 8.1
active 8.1
team 8.1
gate 7.9
subway 7.9
educator 7.9
train 7.8
airport 7.8
high 7.8
scene 7.8
model 7.8
sport 7.7
dance 7.6
action 7.4
worker 7.4
occupation 7.3
leg 7.3
cute 7.2
passenger 7.1
job 7.1
working 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 97.3
outdoor 93.4
footwear 90.2
street 88.3
black and white 81.8
clothing 78.7
person 72.5
dress 71.5

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 99.9%
Calm 86.8%
Surprised 11.4%
Sad 0.7%
Confused 0.4%
Angry 0.4%
Disgusted 0.3%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 19-27
Gender Female, 96.6%
Calm 86.5%
Sad 8.8%
Angry 1.3%
Confused 1%
Happy 0.8%
Fear 0.6%
Surprised 0.6%
Disgusted 0.4%

AWS Rekognition

Age 14-22
Gender Male, 99.2%
Calm 66.8%
Sad 31.3%
Fear 0.4%
Happy 0.4%
Confused 0.4%
Disgusted 0.4%
Angry 0.2%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.7%
Person 97%
Person 94.1%

Categories