Human Generated Data

Title

Untitled (girl standing outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17558

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.8
Human 99.8
Apparel 99.8
Clothing 99.8
Female 99.2
Woman 96.3
Skirt 95.1
Dress 90.2
Shorts 56.3
Girl 55.8
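
The Amazon tags above are confidence-scored labels, and records like this one typically keep only the higher-confidence guesses while low scores (e.g. "Shorts" at 56.3) are treated as uncertain. A minimal sketch of that filtering step, using hypothetical data copied from this record in a shape loosely modeled on a DetectLabels-style Name/Confidence list:

```python
# Hypothetical labels taken from this record; the dict shape is an
# assumption modeled on a Name/Confidence label list, not live API output.
labels = [
    {"Name": "Person", "Confidence": 99.8},
    {"Name": "Skirt", "Confidence": 95.1},
    {"Name": "Dress", "Confidence": 90.2},
    {"Name": "Shorts", "Confidence": 56.3},
    {"Name": "Girl", "Confidence": 55.8},
]

def confident_labels(labels, threshold=90.0):
    """Keep only label names at or above the confidence threshold."""
    return [l["Name"] for l in labels if l["Confidence"] >= threshold]

print(confident_labels(labels))  # low-confidence guesses like "Shorts" drop out
```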

Imagga
created on 2022-02-26

man 31.1
person 26.6
sky 19.9
people 19.5
sport 18.8
tourist 16.9
leisure 16.6
male 16.5
summer 15.4
outdoor 15.3
outdoors 15.2
happy 14.4
adult 14.3
happiness 14.1
attractive 14
freedom 13.7
clothing 12.9
clouds 12.7
silhouette 12.4
park 12.4
lifestyle 12.3
life 11.9
women 11.9
world 11.7
active 11.7
beach 11
traveler 10.9
exercise 10.9
activity 10.8
businessman 10.6
hiking 10.6
success 10.5
standing 10.4
light 10.4
cleaner 10.2
groom 10.2
casual 10.2
relaxation 10.1
travel 9.9
fun 9.7
business 9.7
sexy 9.6
athlete 9.6
rock 9.6
adventure 9.5
water 9.3
black 9.2
vacation 9
sunset 9
mountain 9
landscape 8.9
posing 8.9
couple 8.7
sea 8.6
sitting 8.6
men 8.6
walk 8.6
model 8.6
portrait 8.4
joy 8.4
health 8.3
fashion 8.3
danger 8.2
style 8.2
fitness 8.1
snow 8.1
sun 8.1
cool 8
hair 7.9
holiday 7.9
day 7.9
cloud 7.8
walking 7.6
career 7.6
human 7.5
one 7.5
action 7.4
against 7.4
cheerful 7.3
cute 7.2
love 7.1
season 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 97.7
clothing 95
athletic game 94.7
outdoor 91.7
sport 90.6
person 85.5
woman 83.1
black and white 66.9
water 61.8
footwear 55

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 94.8%
Calm 75.3%
Confused 6.4%
Fear 5.8%
Happy 5.8%
Sad 2.8%
Disgusted 1.6%
Angry 1.1%
Surprised 1.1%
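
The emotion reading above reduces to picking the highest-confidence entry ("Calm" at 75.3%). A small sketch of that selection, using the hypothetical scores from this record in an assumed Type/Confidence list shape:

```python
# Hypothetical emotion scores copied from the face-analysis reading above;
# the list-of-dicts shape is an assumption for illustration.
emotions = [
    {"Type": "CALM", "Confidence": 75.3},
    {"Type": "CONFUSED", "Confidence": 6.4},
    {"Type": "FEAR", "Confidence": 5.8},
    {"Type": "HAPPY", "Confidence": 5.8},
    {"Type": "SAD", "Confidence": 2.8},
    {"Type": "DISGUSTED", "Confidence": 1.6},
    {"Type": "ANGRY", "Confidence": 1.1},
    {"Type": "SURPRISED", "Confidence": 1.1},
]

def dominant_emotion(emotions):
    """Return the emotion label with the highest confidence score."""
    return max(emotions, key=lambda e: e["Confidence"])["Type"]

print(dominant_emotion(emotions))  # CALM
```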

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a person standing in front of a building 86.2%
a person standing in front of a baseball game 39.8%
a person standing in front of a baseball field 39.7%

Text analysis

Amazon

a
MJ17
MJ17 YT3RAS ACTHA
YT3RAS
ACTHA

Google

YT3RA
MJI7
MJI7 YT3RA