Human Generated Data

Title

Untitled (children making sandwiches in kitchen, baby in high chair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16377
Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 98.7
Human 98.7
Person 97.7
Person 97
Person 75.5
Meal 75.2
Food 75.2
Indoors 73.3
Room 73.3
Clinic 69.6
Building 68.5
People 60.3
Dish 57.2

Imagga
created on 2022-02-11

table 39.3
glass 35.1
interior 29.2
restaurant 26
dining 22.8
setting 22.1
dinner 21.4
luxury 21.4
dishwasher 20.2
plate 18.6
fork 18.2
party 18
room 17.7
decor 17.7
home 17.6
banquet 16.8
decoration 16.6
elegant 16.3
white goods 16
kitchen 15.9
drink 15.9
furniture 15.6
perfume 14.9
service 14.8
glasses 14.8
wedding 14.7
food 14.1
lunch 13.9
silverware 13.9
wine 13.8
reception 13.7
napkin 13.7
appliance 13.6
home appliance 13.6
knife 13.4
house 13.4
container 13.1
celebration 12.8
tablecloth 12.7
chair 12.5
arrangement 12.5
toiletry 12.2
flowers 12.2
place 12.1
event 12
modern 11.9
catering 11.7
hotel 11.5
fine 11.4
design 11.2
cutlery 10.7
laboratory 10.6
medical 10.6
formal 10.5
research 10.5
case 10.4
tray 10.2
meal 10.2
equipment 10
science 9.8
experiment 9.7
spoon 9.5
indoor 9.1
people 8.9
candle 8.9
silver 8.8
indoors 8.8
counter 8.7
chemistry 8.7
lifestyle 8.7
sink 8.6
set 8.5
elegance 8.4
dish 8.4
style 8.2
romance 8
water 8
empty 8
cooking 7.9
oven 7.8
biotechnology 7.8
dine 7.8
serve 7.8
lab 7.8
person 7.7
scientific 7.7
chemical 7.7
test 7.7
biology 7.6
eat 7.5
contemporary 7.5
clean 7.5
receptacle 7.5
china 7.4
buffet 7.4
tableware 7.3
stove 7.2
romantic 7.1
man 7.1
architecture 7

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 97.8
black and white 77.6
food 67
person 60.5
table 51.2

Face analysis
AWS Rekognition

Age 16-22
Gender Female, 68.8%
Calm 99.4%
Sad 0.3%
Happy 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 7-17
Gender Female, 99.6%
Calm 99.5%
Happy 0.2%
Sad 0.1%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 6-14
Gender Female, 95.5%
Calm 74.1%
Surprised 19.4%
Sad 5.2%
Angry 0.4%
Confused 0.3%
Happy 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 10-18
Gender Male, 99.8%
Surprised 79.4%
Happy 16%
Calm 2.8%
Angry 0.8%
Disgusted 0.3%
Fear 0.3%
Sad 0.2%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a group of people sitting in front of a window 57.2%
a group of people sitting at a table in front of a window 57.1%
a group of people standing in front of a window 57%

Text analysis

Amazon

26
SKIPPY
EIA--

Google

26
26