Human Generated Data

Title

Untitled (people on beds in large room)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16006.2

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.5
Human 99.5
Person 98.1
Person 97.9
Furniture 96.3
Restaurant 90.5
Person 83.2
Person 81.3
Text 80.5
Cafeteria 76.9
Chair 63
Indoors 61.3
Room 58.1
Cafe 57.6
Table 56.6
Person 55.8
Person 42.8

Imagga
created on 2022-02-11

interior 66.4
room 59.4
hospital 54.6
furniture 42.1
table 41.5
chair 41.2
dishwasher 38.3
modern 37.9
home 36
kitchen 35.2
house 34.3
decor 33.6
floor 31.6
white goods 31.2
design 29.3
indoor 28.3
wood 27.5
indoors 26.4
dining 25.7
apartment 24.9
home appliance 24.8
restaurant 24.6
appliance 23.6
architecture 22.7
luxury 21.5
window 21.1
contemporary 19.8
inside 19.3
stove 18.9
decoration 18.8
style 18.6
comfortable 18.2
glass 17.9
3d 17.1
counter 17
cabinet 16.9
wall 16.5
classroom 16.4
empty 16.3
light 16.1
cafeteria 15.6
domestic 15.5
residential 15.3
lamp 15.3
nobody 14.8
oven 14.7
seat 14.5
building 14.4
faucet 13.8
chairs 13.7
dinner 13.5
food 13.3
wooden 13.2
stool 12.8
sink 12.8
plant 12.7
elegance 12.6
structure 12.5
steel 12.4
metal 12.1
tables 11.8
stylish 11.8
tile 11.7
lifestyle 11.6
hotel 11.5
render 11.3
drink 10.9
vase 10.6
bar 10.2
cook 10.1
cooking 9.6
office 9.6
hall 9.3
service 9.3
relaxation 9.2
clean 9.2
stainless 8.7
lighting 8.7
day 8.6
estate 8.6
relax 8.4
refrigerator 8.3
durables 8
drawer 7.9
urban 7.9
people 7.8
luxurious 7.8
scene 7.8
sofa 7.7
area 7.7
expensive 7.7
rendering 7.6
bed 7.6
clinic 7.6
nurse 7.4
new 7.3

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 99.3
furniture 93.9
table 90.4
chair 69.4
white 68.8
black 65.4
building 63.7
black and white 63.6
old 52.7
vintage 26

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 92.8%
Calm 95.9%
Sad 1.6%
Fear 0.9%
Confused 0.4%
Happy 0.4%
Disgusted 0.3%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 20-28
Gender Female, 50.9%
Sad 88.8%
Calm 3.9%
Fear 2.7%
Happy 1.7%
Angry 0.9%
Confused 0.8%
Surprised 0.7%
Disgusted 0.7%

AWS Rekognition

Age 16-24
Gender Female, 72.5%
Calm 72.8%
Confused 10.5%
Sad 9.2%
Happy 4.2%
Fear 1.1%
Angry 1%
Disgusted 0.7%
Surprised 0.6%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of some people in a room 91.3%
a vintage photo of a group of people in a room 88.7%
a vintage photo of some people 78.8%

Text analysis

Amazon

KODAK
FILM
KODAK SAFETY FILM
SAFETY
I
C-3

Google

KODAK
KODAK SAFETY FILM KODAK
SAFETY
FILM