Human Generated Data

Title

Untitled (women dressed up as Native Americans posing in field)

Date

1953

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2261

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Dance Pose 99.8
Leisure Activities 99.8
Human 99.5
Person 99.5
Person 98.7
Person 98.4
Person 96
Person 92.1
Dance 92
Stage 78.9
Ballet 77.5
Person 73.8
People 66.7
Ballerina 58.4
Apparel 58
Shorts 58
Clothing 58
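
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels API returns. A minimal sketch of such a call (Python with boto3), assuming AWS credentials are configured and the scan is available locally; the filename is hypothetical:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the digitized photograph.
    with open("annas_untitled_1953.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=55 roughly matches the lowest scores listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')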

Imagga
created on 2022-01-30

athlete 100
runner 100
contestant 79
person 49.2
people 35.1
silhouette 33.9
group 29.8
male 24.1
men 24
sport 23.4
crowd 20.2
adult 19.4
sunset 18.9
body 18.4
women 18.2
man 18.1
black 18
active 18
lifestyle 17.3
beach 16.4
fun 15.7
boy 14.8
sky 14.7
dance 14.6
run 14.5
happy 14.4
art 14.3
summer 14.1
friends 14.1
outdoor 13
teenager 12.8
girls 12.8
outdoors 12.7
team 12.5
friendship 12.2
action 12.1
pose 11.8
together 11.4
walking 11.4
couple 11.3
teamwork 11.1
sea 10.9
leisure 10.8
grass 10.3
competition 10.1
joy 10
javelin 10
exercise 10
fitness 9.9
vacation 9.8
human 9.7
clothing 9.6
athletic 9.6
graphic 9.5
play 9.5
females 9.5
party 9.5
youth 9.4
child 9.3
field 9.2
sand 9.2
freedom 9.1
fashion 9
copy space 9
sun 8.9
sexy 8.8
silhouettes 8.7
boys 8.7
running 8.6
outline 8.5
attractive 8.4
health 8.3
children 8.2
style 8.2
lady 8.1
activity 8.1
success 8
family 8
spear 8
happiness 7.8
dancer 7.8
model 7.8
dancing 7.7
jump 7.7
energy 7.6
evening 7.5
back 7.5
fit 7.4
business 7.3
figure 7.2
dress 7.2
recreation 7.2
swimsuit 7.1
portrait 7.1
posing 7.1
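
Imagga's tagger returns a similar tag/confidence list. A minimal sketch against its v2 REST API, assuming an Imagga account; the key, secret, and filename are hypothetical:

    import requests

    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # hypothetical credentials

    with open("annas_untitled_1953.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=auth,
            files={"image": f},
        )

    # Each entry carries a confidence score and a language-keyed tag name.
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')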

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

grass 99.8
outdoor 99.2
dance 97.2
text 96.6
person 87.8
black 66.3
white 61.6
clothing 55.3
posing 52.8
old 44.2
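
Tag/confidence pairs like Microsoft's come from the Azure Computer Vision analyze endpoint with the Tags visual feature. A minimal sketch; the endpoint, key, and filename are hypothetical:

    import requests

    endpoint = "https://my-resource.cognitiveservices.azure.com"  # hypothetical
    key = "AZURE_CV_KEY"  # hypothetical

    with open("annas_untitled_1953.jpg", "rb") as f:
        response = requests.post(
            f"{endpoint}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": key,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    # Azure reports confidence on a 0-1 scale; scale to match the list above.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')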

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 98.1%
Sad 74.7%
Fear 16.4%
Angry 2.2%
Disgusted 2.1%
Happy 2%
Confused 1.5%
Surprised 0.6%
Calm 0.5%

AWS Rekognition

Age 35-43
Gender Male, 99.7%
Sad 72.6%
Fear 12.9%
Happy 9%
Surprised 2.7%
Angry 1.2%
Calm 0.6%
Disgusted 0.5%
Confused 0.4%
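
Per-face age ranges, gender estimates, and emotion distributions like the two blocks above are what Rekognition's DetectFaces API returns when all facial attributes are requested. A minimal sketch, reusing the hypothetical filename from the label example:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("annas_untitled_1953.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unordered; sort by confidence, highest first.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')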

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
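
The six blocks above are Google Cloud Vision face annotations, one per detected face; each attribute is reported as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a numeric score. A minimal sketch with the google-cloud-vision client library, assuming application default credentials and the same hypothetical file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("annas_untitled_1953.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each field is a Likelihood enum, e.g. Likelihood.VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)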

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of a group of people posing for a picture 89.1%
a vintage photo of a group of people posing for the camera 87.2%
a group of people posing for a photo 87.1%
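
Ranked caption candidates like these come from the Azure Computer Vision describe endpoint. A minimal sketch, reusing the hypothetical endpoint and key from the tag example:

    import requests

    endpoint = "https://my-resource.cognitiveservices.azure.com"  # hypothetical
    key = "AZURE_CV_KEY"  # hypothetical

    with open("annas_untitled_1953.jpg", "rb") as f:
        response = requests.post(
            f"{endpoint}/vision/v3.2/describe",
            params={"maxCandidates": 3},  # three candidates, as listed above
            headers={
                "Ocp-Apim-Subscription-Key": key,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    for caption in response.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')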

Text analysis

Amazon

are
YE3A
YE3A NAGOR
NAGOR
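
Fragments like these are typical of Rekognition's DetectText API, which reports whole detected lines alongside their component words (hence "YE3A NAGOR" appearing with its pieces). A minimal sketch:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("annas_untitled_1953.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is typed LINE or WORD; words reference their parent line.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])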

Google

YT3RA°2 -A
-A
YT3RA°2
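
Google's fragments correspond to Cloud Vision text detection, which on stamped or distressed lettering in an old print can yield garbled strings like those above. A minimal sketch with the same client library as the face example:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("annas_untitled_1953.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected text; the rest are fragments.
    for annotation in response.text_annotations:
        print(annotation.description)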