Human Generated Data

Title

Untitled (girl standing in front of church steps)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17329

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Apparel 99.9
Clothing 99.9
Person 99.5
Human 99.5
Person 99.5
Person 99.4
Person 99.3
Person 99
Person 97.6
Shorts 96.4
Female 94.8
Dress 87
Flooring 86.7
Footwear 84.7
Shoe 84.7
Woman 81
Person 80.4
Skirt 73.2
Sleeve 72.3
Shoe 70.1
Handrail 68.5
Banister 68.5
Girl 64
Floor 56.5
Shoe 53
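
These Amazon tags are characteristic of AWS Rekognition's DetectLabels output: a flat list of label names with confidence scores. A minimal sketch of how such a list might be produced with boto3, assuming the photograph is available as a local file ("photo.jpg" and the region are placeholders for this sketch):

```python
import boto3

# Rekognition client; the region is an assumption for this sketch.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the list above bottoms out near 53
)

# Each returned label pairs a name with a confidence score,
# matching the "Name Score" rows listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

Repeated labels such as the multiple "Person" rows correspond to separate detected instances of the same label class.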

Imagga
created on 2022-02-26

man 25.5
people 25.1
male 20.7
city 18.3
adult 17.7
chair 17.6
person 17.4
building 15.4
business 15.2
men 13.7
barrier 13.1
black 12.7
leisure 12.4
support 12.1
active 11.7
device 11.6
run 11.6
lifestyle 11.6
outdoor 11.5
walking 11.4
urban 11.4
competition 11
seat 10.8
pretty 10.5
athlete 10.4
sport 10.2
action 10.2
suit 10.2
street 10.1
obstruction 9.8
architecture 9.8
attractive 9.8
walk 9.5
women 9.5
work 9.4
exercise 9.1
health 9
structure 9
outdoors 9
life 8.9
professional 8.8
indoors 8.8
boy 8.7
sitting 8.6
barber chair 8.5
summer 8.4
fashion 8.3
fit 8.3
speed 8.2
fun 8.2
alone 8.2
one 8.2
playing 8.2
fitness 8.1
lady 8.1
worker 8.1
standing 7.8
portrait 7.8
travel 7.7
outside 7.7
old 7.7
step 7.6
energy 7.6
floor 7.4
holding 7.4
furniture 7.4
success 7.2
businessman 7.1
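
Imagga exposes its tagger as a REST endpoint. A sketch of how a tag list like the one above could be obtained, assuming placeholder API credentials and a publicly reachable image URL:

```python
import requests

# Placeholder credentials and image URL for this sketch.
auth = ("api_key", "api_secret")
image_url = "https://example.com/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=auth,
)
resp.raise_for_status()

# Tags come back with a confidence score and a language-keyed name.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```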

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

black and white 96.3
footwear 93.3
clothing 74.7
street 72.3
black 72
monochrome 71.8
person 60
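
The Microsoft tags are consistent with Azure Computer Vision image tagging. A sketch against the REST interface, assuming a placeholder resource endpoint and subscription key, and API version 3.2:

```python
import requests

# Placeholder Azure resource endpoint and key for this sketch.
endpoint = "https://<resource>.cognitiveservices.azure.com"
key = "<subscription-key>"

resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.com/photo.jpg"},  # placeholder URL
)
resp.raise_for_status()

# Confidence is returned on a 0-1 scale; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```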

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 98.8%
Calm 99.9%
Surprised 0%
Disgusted 0%
Happy 0%
Sad 0%
Fear 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 20-28
Gender Male, 57.7%
Calm 88.5%
Sad 10.1%
Happy 0.4%
Surprised 0.4%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Female, 86.7%
Sad 38.2%
Calm 30%
Fear 21.9%
Confused 3.1%
Surprised 2.8%
Happy 1.8%
Angry 1.3%
Disgusted 0.8%

AWS Rekognition

Age 23-33
Gender Male, 98%
Sad 58%
Confused 18%
Calm 14.6%
Disgusted 3.6%
Happy 2%
Angry 1.4%
Surprised 1.2%
Fear 1.1%

AWS Rekognition

Age 42-50
Gender Male, 99.3%
Confused 30%
Sad 22.1%
Calm 22.1%
Disgusted 10.7%
Happy 9.4%
Fear 2.6%
Angry 2%
Surprised 1.2%
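
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a distribution over eight emotions. A sketch of the underlying DetectFaces call with boto3, again using a placeholder file path and region:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age, gender, and emotions
)

# One FaceDetails entry per detected face, as in the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```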

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Shoe 84.7%

Captions

Microsoft

a person sitting on a bench 50%
a person standing in front of a building 49.9%
a person sitting on a bench in front of a building 35%
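
The Microsoft captions are the kind of output produced by Azure Computer Vision's describe operation, which returns several candidate captions with confidences. A sketch against the same assumed placeholder endpoint and key as before, requesting up to three candidates:

```python
import requests

endpoint = "https://<resource>.cognitiveservices.azure.com"  # placeholder
key = "<subscription-key>"  # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.com/photo.jpg"},  # placeholder URL
)
resp.raise_for_status()

# Candidate captions arrive ranked, each with a 0-1 confidence.
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```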