Human Generated Data

Title

Untitled (three children outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17050

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Apparel 99.8
Clothing 99.8
Human 99.7
Person 99.7
Person 99.3
Person 97.4
Shorts 96.3
Grass 95.8
Plant 95.8
Helmet 94.5
Hand 92.4
Dress 91
Female 89.6
Face 86.3
Baby 85.2
Holding Hands 84.2
Child 82.4
Kid 82.4
Outdoors 81.8
Tree 79.1
Play 77.7
Girl 75.8
Pants 74.8
People 74.7
Photography 69.7
Portrait 69.7
Photo 69.7
Coat 69.5
Nature 66
Woman 64.6
Skirt 59.3
Path 57.5

Imagga
created on 2022-02-26

man 33
people 24.6
beach 23
male 21.5
crutch 21
snow 20.7
person 19.1
sunset 18.9
walking 18
active 18
winter 17.9
sport 17.8
outdoor 17.6
outdoors 17.4
child 16.8
vacation 16.4
cold 16.4
staff 16.3
sky 16
pedestrian 15.4
lifestyle 15.2
silhouette 14.9
sand 14.6
stick 13.9
adult 13.7
sports equipment 13.2
park 13.2
wheeled vehicle 13.1
world 12.6
walk 12.4
couple 12.2
sun 12.1
outside 12
travel 12
sea 11.7
cricket equipment 11.3
barrow 11
summer 10.9
water 10.7
fun 10.5
landscape 10.4
clouds 10.1
tree 10
exercise 10
ocean 10
recreation 9.9
mountain 9.8
together 9.6
boy 9.6
men 9.5
equipment 9.4
handcart 9.4
cricket bat 9.2
leisure 9.1
old 9.1
activity 9
happy 8.8
forest 8.7
love 8.7
adventure 8.5
senior 8.4
vehicle 8.4
shore 8.4
bench 8.1
croquet mallet 8
smiling 8
weather 7.9
happiness 7.8
season 7.8
portrait 7.8
youth 7.7
health 7.6
hill 7.5
air 7.4
life 7.3
ski 7.3
fitness 7.2
women 7.1
kid 7.1
little 7.1
work 7.1
businessman 7.1
day 7.1
professional 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 94.7
clothing 90.3
text 89.3
person 87.3
sport 86.3
athletic game 82.3
black and white 78.4
footwear 76.2

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 95.8%
Happy 32.5%
Calm 30%
Sad 29.9%
Angry 2.1%
Confused 2%
Disgusted 1.3%
Fear 1.1%
Surprised 1.1%

AWS Rekognition

Age 29-39
Gender Male, 84.3%
Sad 84.1%
Calm 8.4%
Happy 3.2%
Fear 1.6%
Disgusted 1%
Angry 0.8%
Confused 0.5%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Helmet 94.5%

Captions

Microsoft

a young boy holding a racket 31.7%
a man and a woman posing for a photo 31.6%
a boy holding a racket 31.5%