Human Generated Data

Title

Cafeteria, Brookhaven National Lab

Date

1979

People

Artist: Per Brandin, American, born 1953

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.232

Copyright

© Per Brandin 1979

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-01-08

Person 99.2
Human 99.2
Chair 98.2
Furniture 98.2
Clothing 93.5
Apparel 93.5
Female 91.9
Woman 81
Indoors 80.1
Oven 74
Appliance 74
Food 74
Meal 74
Dish 74
Room 65.1
Pottery 64.6
Portrait 61.4
Face 61.4
Photography 61.4
Photo 61.4
Home Decor 59.5
Pot 55.7
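
The values above are label confidences returned by Amazon Rekognition. A minimal sketch of an equivalent label-detection call with boto3 follows; the image file name and the MinConfidence threshold are placeholders, not details of the museum's actual pipeline.

import boto3

rekognition = boto3.client("rekognition")
with open("cafeteria_brookhaven.jpg", "rb") as image_file:  # placeholder file name
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # assumed cutoff; the lowest score listed above is 55.7
    )
for label in response["Labels"]:  # e.g. "Person 99.2"
    print(f"{label['Name']} {label['Confidence']:.1f}")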

Clarifai
created on 2023-10-25

woman 99.5
girl 99.1
people 98.7
portrait 98.6
coffee 98.2
breakfast 98
adult 97.7
dawn 96.9
one 96.1
tea 94.6
cup 94
smile 93.2
young 92.6
cooking 92
beautiful 91.2
family 91.1
model 90.7
retro 90.7
indoors 90.5
lady 89
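
Clarifai's API reports concept probabilities in the 0-1 range; the list above shows them scaled to 0-100. A minimal sketch of a comparable call against Clarifai's v2 predict endpoint; the API key, model name, and image URL are placeholders, and the auth and model details are assumptions rather than a record of how these tags were produced.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
MODEL_ID = "general-image-recognition"  # assumed general recognition model
IMAGE_URL = "https://example.org/cafeteria_brookhaven.jpg"  # placeholder
response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")  # e.g. "woman 99.5"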

Imagga
created on 2022-01-08

person 34.8
adult 33
office 31.3
businesswoman 30.9
computer 30.6
attractive 30.1
business 29.8
laptop 29.7
telephone 28.9
happy 28.8
people 27.3
pretty 27.3
smiling 26.8
work 26.7
dial telephone 26.6
home appliance 26.4
iron 25.7
working 24.8
lifestyle 24.6
smile 24.2
desk 23.9
professional 23.6
appliance 23.2
sitting 22.3
gramophone 22.2
portrait 22
job 21.2
electronic equipment 19.7
brunette 19.2
women 19
corporate 18.9
home 18.4
device 18.2
record player 18
lady 17.9
technology 17.8
indoors 17.6
executive 17.5
casual 17
happiness 15.7
room 15.7
sexy 15.3
machine 15.1
table 15.1
cheerful 14.6
durables 14.6
secretary 14.6
success 14.5
looking 14.4
worker 14.3
hair 14.3
one 14.2
model 14
indoor 13.7
cute 13.6
fashion 13.6
businesspeople 13.3
manager 13
phone 12.9
successful 12.8
kitchen 12.7
modern 12.6
holding 12.4
equipment 12.2
20s 11.9
talking 11.4
confident 10.9
communication 10.9
stylish 10.9
suit 10.8
notebook 10.7
disk jockey 10.5
workplace 10.5
friendly 10.1
interior 9.7
domestic 9.6
wireless 9.5
shirt 9.3
clothing 9.2
alone 9.1
pose 9.1
career 8.5
black 8.4
broadcaster 8.4
relaxation 8.4
human 8.3
single 8.2
style 8.2
student 8.2
blond 8.1
look 7.9
face 7.8
consultant 7.8
color 7.8
waiting 7.7
using 7.7
expression 7.7
youth 7.7
twenties 7.6
formal 7.6
sit 7.6
one person 7.5
leisure 7.5
glasses 7.4
relaxing 7.3
consumer goods 7.2
posing 7.1
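
A minimal sketch of a comparable request against Imagga's v2 tagging endpoint; the credentials and image URL are placeholders, and the response shape shown is an assumption based on Imagga's documented format.

import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"  # placeholder
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/cafeteria_brookhaven.jpg"  # placeholder
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth
    timeout=30,
)
response.raise_for_status()
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")  # e.g. "person 34.8"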

Google
created on 2022-01-08

Face 98.3
Hairstyle 95.1
Arm 94.4
Tableware 89.2
Plate 88.8
Sleeve 87.2
Dishware 82.6
Waist 80.5
Recipe 78.9
Cooking 78.6
Art 78.1
Food 76.3
Serveware 75
Vintage clothing 74.9
T-shirt 74.8
Monochrome photography 72.6
Monochrome 69.3
Homemaker 69.2
Eyewear 67.6
Fashion design 65.5
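
Google Cloud Vision returns label scores in the 0-1 range; the list above shows them scaled to 0-100. A minimal sketch with the google-cloud-vision Python client, assuming credentials are configured via GOOGLE_APPLICATION_CREDENTIALS; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("cafeteria_brookhaven.jpg", "rb") as image_file:  # placeholder file name
    image = vision.Image(content=image_file.read())
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")  # e.g. "Face 98.3"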

Microsoft
created on 2022-01-08

wall 99.4
person 97.6
clothing 94.3
text 94.2
woman 93.4
human face 87.4
smile 72.9
black and white 66.8
posing 45.2
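
A minimal sketch of a comparable tagging call with the Azure Computer Vision Python SDK (azure-cognitiveservices-vision-computervision); the endpoint, key, and image URL are placeholders, and tag_image is only one of the SDK's analysis methods, not necessarily the one used here.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder
IMAGE_URL = "https://example.org/cafeteria_brookhaven.jpg"  # placeholder
client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)
for tag in result.tags:  # confidences come back in 0-1
    print(f"{tag.name} {tag.confidence * 100:.1f}")  # e.g. "wall 99.4"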

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 100%
Calm 99.2%
Sad 0.2%
Angry 0.2%
Surprised 0.2%
Confused 0.1%
Fear 0.1%
Happy 0%
Disgusted 0%
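
Age range, gender, and emotion scores like those above correspond to Rekognition's face-attribute analysis. A minimal sketch with boto3; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")
with open("cafeteria_brookhaven.jpg", "rb") as image_file:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    for emotion in face["Emotions"]:  # e.g. "Calm 99.2%"
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")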

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
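
Google Vision reports face attributes as likelihood buckets (for example, "Very unlikely") rather than percentages. A minimal sketch with the google-cloud-vision client; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("cafeteria_brookhaven.jpg", "rb") as image_file:  # placeholder file name
    image = vision.Image(content=image_file.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)  # e.g. VERY_UNLIKELY
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)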

Feature analysis

Amazon

Person 99.2%
Chair 98.2%

Categories

Imagga

paintings art 88.2%
food drinks 10.9%
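
Imagga's categorization is a separate endpoint from its tagging endpoint. A minimal sketch, assuming the "personal_photos" categorizer produces category labels of this kind; the categorizer name, credentials, image URL, and response shape are all assumptions.

import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"  # placeholder
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/cafeteria_brookhaven.jpg"  # placeholder
response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()
for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")  # e.g. "paintings art 88.2%"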

Captions