Human Generated Data

Title

Untitled (man teaching girls a dance)

Date

c. 1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15694.1

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.6
Human 99.6
Person 99.3
Person 99.3
Person 99
Person 99
Person 98.7
Person 98.5
Clothing 96.2
Apparel 96.2
Person 90.7
Dance Pose 90.1
Leisure Activities 90.1
Person 89.3
Person 89.2
Female 77.2
People 71.6
Dance 59.8
Sports 59
Sport 59
Woman 57.7
Skating 57
Ice Skating 56
Fashion 55.5
Gown 55.5
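Tag lists like the Amazon block above are typically produced by AWS Rekognition's `DetectLabels` API. The sketch below is a hypothetical illustration, not this dataset's actual pipeline: the bucket and key names are assumptions, and the network call is left commented out since it needs credentials. The small helper that renders a label response into "Name confidence" lines like those above does run locally.

```python
# Sketch: generating label tags like the Amazon list above.
# The boto3 call is illustrative only -- the bucket/key are
# assumptions and the call needs AWS credentials, so it is
# commented out here.
#
# import boto3
# rekognition = boto3.client("rekognition")
# response = rekognition.detect_labels(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#     MinConfidence=55,
# )
# labels = response["Labels"]

def format_labels(labels, min_confidence=55.0):
    """Render DetectLabels output as 'Name confidence' lines,
    sorted by descending confidence, as in the listing above."""
    kept = [l for l in labels if l["Confidence"] >= min_confidence]
    kept.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {round(l['Confidence'], 1):g}" for l in kept]

# Hand-made sample mirroring a few of the tags above:
sample = [
    {"Name": "Person", "Confidence": 99.6},
    {"Name": "Human", "Confidence": 99.6},
    {"Name": "Clothing", "Confidence": 96.2},
    {"Name": "Gown", "Confidence": 55.5},
    {"Name": "Car", "Confidence": 12.0},  # below threshold, filtered out
]
print(format_labels(sample))
```

The `:g` format drops a trailing `.0`, matching how whole-number confidences appear above (e.g. "Person 99" rather than "Person 99.0").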

Imagga
created on 2022-02-05

people 33.4
group 25
adult 23.9
person 23.8
women 21.3
fashion 20.3
shoe shop 19.2
men 18.9
man 18.8
shop 17.9
portrait 17.5
dress 17.2
happy 16.9
clothing 16.9
sketch 16.9
body 15.2
human 15
team 14.3
drawing 14.2
male 13.5
attractive 13.3
teacher 12.9
indoor 12.8
indoors 12.3
smiling 12.3
lifestyle 12.3
urban 12.2
professional 12.2
lady 12.2
elegant 12
mercantile establishment 11.9
city 11.6
sport 11.6
silhouette 11.6
interior 11.5
sexy 11.2
pretty 11.2
business 10.9
model 10.9
dancer 10.8
crowd 10.6
walking 10.4
style 10.4
luxury 10.3
motion 10.3
happiness 10.2
brassiere 10.1
exercise 10
active 9.9
garment 9.8
modern 9.8
educator 9.8
representation 9.6
life 9.5
smile 9.3
copy space 9
activity 9
posing 8.9
window 8.7
standing 8.7
move 8.6
sitting 8.6
legs 8.5
casual 8.5
joy 8.3
undergarment 8.3
performer 8.3
healthy 8.2
sensual 8.2
place of business 8.1
cheerful 8.1
suit 8.1
covering 8
home 8
businessman 7.9
hair 7.9
design 7.9
work 7.8
black 7.8
train 7.7
building 7.6
elegance 7.5
vintage 7.4
woman's clothing 7.4
world 7.3
sensuality 7.3
make 7.3
pose 7.2
figure 7.2
fitness 7.2
dance 7.2
looking 7.2
consumer goods 7.1

Google
created on 2022-02-05

Active shorts 93
Gesture 85.3
Vintage clothing 69.9
Monochrome 68.9
Monochrome photography 68.5
Art 67.8
Font 66.5
Illustration 65.1
Crew 61.6
Paper product 60.8
Uniform 60.1
Visual arts 57.7
Team 56.1
Knee 55.2
Recreation 53.1
Room 51.3
Drawing 50.9

Microsoft
created on 2022-02-05

text 98.7
footwear 93.7
clothing 93.1
person 91.5
old 90.8
dance 90.5
group 88.5
woman 80.9
people 64.4
posing 62.6
white 60.5

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 57.8%
Calm 59.9%
Sad 22.5%
Happy 8.5%
Disgusted 2.6%
Angry 2.2%
Confused 1.7%
Fear 1.5%
Surprised 1.1%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Calm 56%
Sad 26%
Confused 13.7%
Surprised 1.2%
Happy 1.2%
Angry 0.9%
Disgusted 0.7%
Fear 0.2%

AWS Rekognition

Age 25-35
Gender Female, 72%
Confused 34.8%
Sad 30.9%
Happy 14.9%
Calm 13.3%
Surprised 1.7%
Disgusted 1.6%
Angry 1.4%
Fear 1.3%
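The per-face summaries above follow the shape of Rekognition's `DetectFaces` response (an `AgeRange`, a `Gender` with confidence, and a list of `Emotions`). As a hedged sketch, not this record's actual tooling, the helper below turns one `FaceDetail` dict into lines in that format; the API call itself is commented out since it needs credentials, and the sample data simply mirrors the first face above.

```python
# Sketch: rendering a Rekognition DetectFaces result as the
# per-face summaries above.  The API call is illustrative only
# (it needs AWS credentials), so only the formatting helper runs.
#
# import boto3
# rekognition = boto3.client("rekognition")
# response = rekognition.detect_faces(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#     Attributes=["ALL"],
# )
# face = response["FaceDetails"][0]

def summarize_face(face):
    """Turn one FaceDetail into lines like 'Age 22-30',
    'Gender Female, 57.8%', then emotions by descending confidence."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, "
        f"{round(face['Gender']['Confidence'], 1):g}%",
    ]
    emotions = sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    lines += [f"{e['Type'].capitalize()} {round(e['Confidence'], 1):g}%"
              for e in emotions]
    return lines

# Hand-made example mirroring the first face listed above:
face = {
    "AgeRange": {"Low": 22, "High": 30},
    "Gender": {"Value": "Female", "Confidence": 57.8},
    "Emotions": [
        {"Type": "CALM", "Confidence": 59.9},
        {"Type": "SAD", "Confidence": 22.5},
        {"Type": "HAPPY", "Confidence": 8.5},
    ],
}
print("\n".join(summarize_face(face)))
```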

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 86.9%
a vintage photo of a group of people posing for a picture 86.8%
a group of people posing for a photo 86.7%