Human Generated Data

Title

Untitled (child on crutches with nurse)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15559.1

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Apparel 99.9
Clothing 99.9
Person 99.6
Human 99.6
Shorts 99.6
Person 98.5
Female 91.1
Shoe 87.3
Footwear 87.3
Flooring 87
Skirt 81.9
Floor 80.3
Woman 78.2
Sleeve 69.3
Portrait 61.4
Photography 61.4
Face 61.4
Photo 61.4
Shoe 58.8
Chair 57.3
Furniture 57.3

Imagga
created on 2022-02-05

crutch 80.1
staff 61.9
stick 48.9
shopping cart 34.4
handcart 27.8
man 26.2
people 23.4
cleaner 23
male 22.7
city 20.8
wheeled vehicle 20.5
business 19.4
happy 19.4
adult 19
urban 18.3
outdoors 17.9
person 17.5
men 16.3
building 16.2
container 15.5
walking 15.1
walk 14.3
smile 14.3
women 14.2
job 14.2
standing 13.9
swab 13.5
smiling 13
office 12.9
day 12.6
businessman 12.4
cleaning implement 11.8
suit 11.8
modern 11.2
attractive 11.2
window 11
bag 10.4
architecture 10.2
street 10.1
lifestyle 10.1
active 10.1
worker 9.8
full length 9.7
life 9.6
corporate 9.5
work 9.4
senior 9.4
casual 9.3
shopping 9.3
silhouette 9.1
pretty 9.1
fashion 9
group 8.9
luggage 8.8
happiness 8.6
travel 8.4
basket 8.4
holding 8.3
fun 8.2
one 8.2
cheerful 8.1
chair 8
interior 8
couple 7.8
airport 7.8
net 7.7
move 7.7
old 7.7
adults 7.6
one person 7.5
joy 7.5
executive 7.4
professional 7.3
indoor 7.3
cute 7.2
recreation 7.2
hall 7.2
working 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

clothing 96.7
footwear 96.5
person 95.7
black and white 87.6
standing 86.1
man 85.5
text 74.9

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 99.3%
Calm 95.8%
Surprised 3.5%
Happy 0.2%
Disgusted 0.2%
Confused 0.1%
Sad 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 36-44
Gender Male, 69.1%
Calm 79.5%
Sad 18.2%
Confused 0.8%
Disgusted 0.8%
Angry 0.3%
Surprised 0.2%
Fear 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 87.3%

Captions

Microsoft

a person standing in front of a building 83%
a person standing next to a building 82%
a man and a woman standing in front of a building 69.9%