Human Generated Data

Title

Untitled (hospital patients in wheelchairs)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16009.1

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.7
Human 99.7
Wheel 98.5
Machine 98.5
Furniture 97.6
Chair 95.9
Person 95.9
Person 95.5
Person 95.1
Wheel 95
Person 92.2
Bike 91.8
Vehicle 91.8
Bicycle 91.8
Transportation 91.8
Person 91.1
Bicycle 89.9
Person 85
Sitting 74.9
People 68.4
Person 67.9
Wheel 63.7
Floor 63.7
Couch 62.8
Text 59.9
Person 59.9
Wheel 59.7
Porch 56.8
Indoors 56
Room 56
Electronics 55.9
Display 55.9
Monitor 55.9
Screen 55.9
Door 55.3
LCD Screen 55.2
Person 42.2
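
The label-and-confidence pairs above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of reproducing such tags, assuming boto3 is configured with valid AWS credentials; "photograph.jpg" is a placeholder file name, not the museum's actual asset:

```python
# Sketch: label detection with AWS Rekognition (assumed setup, placeholder file).
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:
    response = client.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=40,  # the list above includes labels down to ~42%
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```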

Imagga
created on 2022-02-11

sketch 29.2
house 28.4
room 25.9
drawing 23.8
architecture 23.7
window 22.9
home 21.5
barbershop 21.2
building 19.6
shop 18.8
interior 18.6
representation 17.9
old 16.7
wood 15.8
chair 14.9
modern 14.7
light 14.7
design 14.6
door 14.5
mercantile establishment 14.2
city 14.1
structure 13.8
indoors 13.2
wall 12.9
inside 12.9
travel 12.7
newspaper 12.3
sliding door 11.4
floor 11.1
table 10.7
history 10.7
furniture 10.7
ancient 10.4
indoor 10
hospital 9.9
patient 9.9
place of business 9.5
glass 9.3
product 9.1
urban 8.7
windows 8.6
empty 8.6
estate 8.5
vintage 8.3
decoration 8.2
style 8.2
paint 8.1
bedroom 8.1
locker 7.9
luxury 7.7
grunge 7.7
real 7.6
elegance 7.5
brick 7.5
town 7.4
barrier 7.4
business 7.3
decor 7.1
antique 7
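
The tag list above follows the format of Imagga's v2 tagging endpoint. A minimal sketch, assuming placeholder API credentials and a placeholder image URL:

```python
# Sketch: image tagging via the Imagga v2 API (placeholder credentials and URL).
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photograph.jpg"},
    auth=(API_KEY, API_SECRET),
)

# Imagga returns tags as {"tag": {"en": ...}, "confidence": ...} entries.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```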

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 99.8
table 86.7
black and white 85.8
chair 83
furniture 81.6
house 75.7

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 94.8%
Surprised 57.3%
Calm 30.4%
Disgusted 3.3%
Fear 3.2%
Angry 2.3%
Happy 1.8%
Sad 1.5%
Confused 0.3%

AWS Rekognition

Age 16-22
Gender Female, 94.8%
Happy 93.2%
Calm 4.2%
Fear 0.7%
Sad 0.7%
Surprised 0.5%
Angry 0.3%
Disgusted 0.2%
Confused 0.2%
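
The age range, gender, and per-emotion percentages above match the shape of AWS Rekognition DetectFaces responses when all facial attributes are requested. A minimal sketch, again with a placeholder file name:

```python
# Sketch: face attribute analysis with AWS Rekognition (placeholder file).
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:
    response = client.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```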

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
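
The Surprise/Anger/Sorrow/Joy/Headwear/Blurred likelihoods above correspond to the face-annotation fields of Google Cloud Vision. A minimal sketch, assuming Google Cloud credentials are configured in the environment and using a placeholder file name:

```python
# Sketch: face detection with Google Cloud Vision (assumes google-cloud-vision >= 2.0).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood is an enum such as VERY_UNLIKELY or UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```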

Feature analysis

Amazon

Person 99.7%
Wheel 98.5%
Bicycle 91.8%

Captions

Microsoft

a group of people in front of a window 47.7%
a group of people in a room 47.6%
a group of people next to a window 44.1%
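
Ranked captions with confidence scores, as listed above, are the kind of output produced by the Azure Computer Vision "describe" endpoint. A minimal sketch, with placeholder endpoint and subscription key:

```python
# Sketch: image captioning via Azure Computer Vision v3.2 (placeholder endpoint/key).
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                   # placeholder

with open("photograph.jpg", "rb") as image_file:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```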

Text analysis

Amazon

KODAK
FILM
SAFETY FILM
SAFETY
8
S
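
The detected strings above resemble AWS Rekognition DetectText output. A minimal sketch with a placeholder file name:

```python
# Sketch: text detection with AWS Rekognition (placeholder file).
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:
    response = client.detect_text(Image={"Bytes": image_file.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # results include both LINE and WORD entries
        print(detection["DetectedText"])
```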

Google

S'AFETY
FILM
KODAK
KODAK S'AFETY FILM KODAK
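
The OCR strings above match the form of Google Cloud Vision text detection results. A minimal sketch, assuming configured credentials and a placeholder file name:

```python
# Sketch: text detection with Google Cloud Vision (assumes google-cloud-vision >= 2.0).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```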