Human Generated Data

Title

Untitled (nurse giving boy water therapy)

Date

1947, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.137

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 95.9
Person 95.9
Leisure Activities 82.8
Musical Instrument 80.3
Cello 70.3
Musician 68.7
Guitar 63.4
Tub 60.8
Washing 56.4
Floor 55.1
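
Label-detection services such as the one behind these tags return label/confidence pairs, and consumers typically keep only labels above a minimum-confidence threshold. A minimal sketch, using hypothetical data that mirrors the tags listed above:

```python
# Hypothetical sketch: filtering machine-generated tags by a minimum
# confidence threshold, as label-detection APIs commonly allow.
# The pairs below mirror the Amazon tags listed above.
labels = [
    ("Human", 95.9), ("Person", 95.9), ("Leisure Activities", 82.8),
    ("Musical Instrument", 80.3), ("Cello", 70.3), ("Musician", 68.7),
    ("Guitar", 63.4), ("Tub", 60.8), ("Washing", 56.4), ("Floor", 55.1),
]

def filter_labels(labels, min_confidence=80.0):
    """Keep only labels at or above the threshold, sorted from
    most to least confident (stable for ties)."""
    kept = [(name, score) for name, score in labels if score >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(filter_labels(labels))
```

Raising the threshold trades recall for precision: at 80.0 only the four strongest labels survive, which screens out the plausible-but-wrong instrument guesses ("Cello", "Guitar").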

Imagga
created on 2021-12-14

washbasin 59.2
basin 50.3
steel drum 46.6
percussion instrument 37.4
vessel 35.8
musical instrument 28.2
container 27.3
technology 21.5
wok 20.8
pan 20.3
record 19.4
disc 16.5
data 16.4
disk 15.6
drive 15.1
metal 14.5
computer 14.4
object 13.9
digital 13.8
kitchen 13.5
music 13.5
cooking utensil 13.3
sink 13.3
electronics 13.3
close 13.1
equipment 12.2
device 12.2
modern 11.9
information 11.5
storage 11.4
bathroom 11.1
3d 10.8
interior 10.6
read 10.6
business 10.3
silver 9.7
black 9.6
home 9.6
design 9.6
hard 9.5
washstand 9.4
sound 9.4
reflection 8.9
kitchen utensil 8.9
steel 8.8
cooking 8.7
memory 8.7
write 8.5
closeup 8.1
detail 8
shiny 7.9
software 7.8
file 7.7
wash 7.7
hardware 7.7
head 7.6
clean 7.5
three dimensional 7.5
leisure 7.5
electronic 7.5
style 7.4
man 7.4
retro 7.4
entertainment 7.4
cook 7.3
utensil 7.3
open 7.2
machine 7.1
male 7.1
copy 7.1

Microsoft
created on 2021-12-14

indoor 97.7
sink 96.7
text 93.3
home appliance 71
kitchenware 66.3
pan 64.3
black and white 61.7
wok 60
person 58.8
bathroom 56.2
countertop 52.1
kitchen appliance 25.6

Face analysis

AWS Rekognition

Age 12-22
Gender Female, 94%
Calm 81.2%
Sad 7.5%
Confused 6%
Happy 1.7%
Angry 1.1%
Disgusted 1.1%
Surprised 0.9%
Fear 0.5%

AWS Rekognition

Age 22-34
Gender Male, 86.5%
Sad 55.2%
Confused 21.5%
Calm 14.6%
Angry 5.3%
Surprised 2.1%
Happy 0.6%
Fear 0.6%
Disgusted 0.2%
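
Face-analysis results like those above are usually summarized by their dominant emotion, i.e. the highest-scoring entry in the distribution. A minimal sketch, using hypothetical data that mirrors the second face above:

```python
# Hypothetical sketch: picking the dominant emotion from the score
# distribution a face-analysis service returns. The values mirror
# the second AWS Rekognition face above.
emotions = {
    "Sad": 55.2, "Confused": 21.5, "Calm": 14.6, "Angry": 5.3,
    "Surprised": 2.1, "Happy": 0.6, "Fear": 0.6, "Disgusted": 0.2,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # ('Sad', 55.2)
```

Note that the two faces here disagree in shape: the first is strongly unimodal ("Calm" at 81.2%), while the second spreads mass across "Sad", "Confused", and "Calm", so its dominant label is far less certain.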

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.9%

Captions

Microsoft

a person sitting in a pan on a stove 52.9%
a person sitting on a pan on a stove 51.3%
a close up of a pan on a stove 51.2%