Human Generated Data

Title

Untitled (white women dressed in Indian costumes)

Date

1953, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.199

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.5
Apparel 99.5
Grass 99.4
Plant 99.4
Person 98.7
Human 98.7
Person 98.6
Person 97.9
Person 96.4
Person 96.3
Person 95
Shorts 93.1
People 84.4
Female 83.7
Outdoors 76.5
Swimwear 69.7
Leisure Activities 64.9
Lawn 63.6
Skin 61.5
Bikini 61.3
Portrait 61
Face 61
Photography 61
Photo 61
Girl 60.4
Woman 60.1
Costume 58.2
Helmet 58.2
Crowd 57.7
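Label lists like the Amazon block above follow the shape of an Amazon Rekognition `detect_labels` response (each label has a `Name` and a `Confidence` percentage). A minimal sketch of turning such a response into the "label score" lines shown here, using an illustrative sample response rather than the actual API output for this photograph:

```python
# Sketch: render a Rekognition-style detect_labels response as
# "Name score" lines, highest confidence first. The sample response
# below is illustrative, not this record's real data.

def format_labels(response, min_confidence=50.0):
    """Return 'Name score' lines for labels above a confidence floor."""
    labels = [
        (lbl["Name"], round(lbl["Confidence"], 1))
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    # :g drops a trailing .0, matching lines like "Person 95" above
    return [f"{name} {score:g}" for name, score in labels]

sample = {  # shape follows the detect_labels response format
    "Labels": [
        {"Name": "Clothing", "Confidence": 99.52},
        {"Name": "Grass", "Confidence": 99.41},
        {"Name": "Person", "Confidence": 95.0},
    ]
}
print("\n".join(format_labels(sample)))
# → Clothing 99.5
#   Grass 99.4
#   Person 95
```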

Clarifai
created on 2023-10-27

people 99.5
group together 98.5
wear 97.8
group 97.2
adult 94.4
man 91.8
girl 90.3
monochrome 89.2
art 88.9
many 88.8
woman 88.7
outfit 88.6
recreation 88.3
several 87.4
child 85.9
dancer 84.3
veil 81.6
fun 79.7
music 79.7
field 77.9

Imagga
created on 2022-01-23

football helmet 32.6
sport 31.7
silhouette 30.6
helmet 30.2
sunset 28.8
people 27.9
grass 24.5
sky 24.3
outdoors 24
active 22.6
man 22.2
outdoor 21.4
summer 20.6
headdress 19.7
fun 19.5
ball 19.4
male 19.2
field 18.4
brass 17.7
clothing 17.5
horse 17.2
men 17.2
action 16.7
leisure 16.6
boy 16.5
activity 16.1
bugle 16
lifestyle 15.9
happy 15.7
sun 15.3
child 14.9
athlete 14.6
run 14.5
clouds 14.4
adult 14.3
joy 14.2
black 13.8
playing 13.7
beach 13.6
wind instrument 13.5
football 13.5
team 13.4
family 13.3
happiness 13.3
player 13.2
sports 12.9
play 12.9
competition 12.8
freedom 12.8
person 12.5
group 12.1
landscape 11.9
children 11.8
fitness 11.7
game 11.6
running 11.5
girls 10.9
soccer 10.6
couple 10.5
device 10.3
evening 10.3
outside 10.3
two 10.2
recreation 9.9
kick 9.8
together 9.6
athletic 9.6
friends 9.4
youth 9.4
musical instrument 9.3
travel 9.2
exercise 9.1
vacation 9
meadow 9
animal 9
sunlight 8.9
kid 8.9
horses 8.8
boys 8.7
women 8.7
love 8.7
dusk 8.6
enjoying 8.5
back 8.5
free 8.4
ranch 8.3
covering 8.1
harness 8
smiling 8
sea 7.8
life 7.8
consumer goods 7.8
practice 7.7
foot 7.6
energy 7.6
friendship 7.5
teenager 7.3
mountain 7.1

Google
created on 2022-01-23

Vertebrate 92.2
Sky 88.4
Cloud 87.4
Gesture 85.3
People in nature 84.1
Happy 80.1
Adaptation 79.4
Recreation 71.2
Monochrome photography 71.1
Art 69.9
Event 68.1
Photo caption 67.2
Crew 63
Stock photography 62.9
Team sport 61.8
Team 56.9
Grass 53.9
Sports 53.8
Fun 51.3
Monochrome 51.1

Microsoft
created on 2022-01-23

grass 100
outdoor 98.4
text 90.5
cartoon 80.9
person 77.7
group 75.2
clothing 75.2
posing 67.4
old 60.3
man 53.4
dance 50.5
clothes 18.7
horse 13.8

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Female, 97.7%
Disgusted 87.9%
Happy 3.2%
Sad 3.1%
Surprised 2.7%
Confused 1.6%
Angry 0.9%
Fear 0.4%
Calm 0.3%

AWS Rekognition

Age 19-27
Gender Female, 84.1%
Happy 93.8%
Surprised 1.5%
Fear 1.3%
Angry 1.1%
Sad 1%
Calm 0.6%
Confused 0.4%
Disgusted 0.4%

AWS Rekognition

Age 18-24
Gender Female, 98.5%
Happy 87%
Sad 5.7%
Angry 1.8%
Surprised 1.6%
Confused 1.1%
Fear 1.1%
Calm 0.9%
Disgusted 0.7%

AWS Rekognition

Age 22-30
Gender Female, 88.3%
Sad 48.6%
Angry 21.4%
Happy 11.9%
Fear 6.2%
Calm 4.1%
Disgusted 3.2%
Surprised 2.5%
Confused 2.1%

AWS Rekognition

Age 25-35
Gender Female, 91.2%
Happy 97.1%
Calm 1.2%
Fear 0.5%
Sad 0.4%
Confused 0.3%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 24-34
Gender Female, 60.6%
Happy 33.8%
Fear 24.3%
Sad 22.5%
Confused 8.2%
Surprised 3.5%
Angry 3.3%
Disgusted 2.5%
Calm 1.8%
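The AWS Rekognition blocks above correspond to one `FaceDetail` each from a `detect_faces` call with `Attributes=['ALL']`: an `AgeRange`, a `Gender` with confidence, and an `Emotions` list. A sketch of summarising one such detail in the style used here (the sample face is illustrative, and `summarise_face` is my own helper name):

```python
# Sketch: summarise one Rekognition FaceDetail as the "Age / Gender /
# emotions" lines above. The sample detail is illustrative data.

def summarise_face(face):
    lines = [f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"]
    gender = face["Gender"]
    lines.append(f"Gender {gender['Value']}, {round(gender['Confidence'], 1):g}%")
    # Emotions arrive unordered; sort strongest first as in the record
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    lines += [
        f"{e['Type'].capitalize()} {round(e['Confidence'], 1):g}%"
        for e in emotions
    ]
    return lines

sample_face = {
    "AgeRange": {"Low": 23, "High": 33},
    "Gender": {"Value": "Female", "Confidence": 97.68},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 3.2},
        {"Type": "DISGUSTED", "Confidence": 87.9},
    ],
}
print("\n".join(summarise_face(sample_face)))
# → Age 23-33
#   Gender Female, 97.7%
#   Disgusted 87.9%
#   Happy 3.2%
```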

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
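The Google Vision blocks above report face attributes as likelihood buckets rather than percentages; the Vision API's `Likelihood` enum orders them UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY. A sketch of rendering those enum values as the readable labels used here (the sample values are illustrative, and `render_face` is my own helper name):

```python
# Sketch: map Vision API Likelihood enum ints (0-5) to the readable
# labels shown above. Sample values are illustrative.

LIKELIHOOD = ["Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely"]

def render_face(likelihoods):
    """likelihoods: mapping of attribute name -> Likelihood enum int."""
    return [f"{attr} {LIKELIHOOD[value]}" for attr, value in likelihoods.items()]

sample = {"Surprise": 1, "Anger": 1, "Sorrow": 1,
          "Joy": 4, "Headwear": 1, "Blurred": 1}
print("\n".join(render_face(sample)))
# → Surprise Very unlikely
#   Anger Very unlikely
#   Sorrow Very unlikely
#   Joy Likely
#   Headwear Very unlikely
#   Blurred Very unlikely
```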

Feature analysis

Amazon

Person 98.7%