Human Generated Data

Title

Untitled (three women and a man eating at a restaurant table)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4557

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.4
Person 99.4
Person 99.2
Person 98.8
Person 98.5
Person 98.1
Person 97.9
Restaurant 97.1
Person 95.7
Meal 91.2
Food 91.2
Person 90.2
Person 89.5
Person 88.9
Person 87.8
Person 82.4
Person 80.7
Food Court 73.6
Cafeteria 72.7
Sitting 71.1
Person 68.9
Person 68.6
Dish 65.3
Suit 64.9
Clothing 64.9
Overcoat 64.9
Coat 64.9
Apparel 64.9
Cafe 63.9
Table 63.7
Furniture 63.7

Imagga
created on 2022-02-05

people 30.6
business 28.5
person 28.1
male 27.6
man 27.6
businessman 27.3
brass 23.8
group 23.3
adult 22.8
office 22.3
education 20.8
wind instrument 20.7
men 20.6
job 19.4
work 18.8
professional 18.7
room 18.3
classroom 16.9
teacher 16.4
laptop 15.6
class 15.4
team 15.2
happy 15
businesswoman 14.5
computer 14.4
student 14.3
musical instrument 14.2
meeting 14.1
indoors 14
stage 13.3
businesspeople 13.3
blackboard 12.9
school 12.8
human 12.7
casual 12.7
women 12.6
desk 12.6
suit 12.6
communication 12.6
bartender 12.4
table 12.3
modern 11.9
technology 11.9
indoor 11.9
board 11.7
smiling 11.6
success 11.2
study 11.2
corporate 11.2
teamwork 11.1
cornet 10.7
teaching 10.7
working 10.6
boss 10.5
executive 10.3
black 10.2
horizontal 10
worker 10
sax 9.9
musician 9.9
hand 9.9
entrepreneur 9.8
discussion 9.7
looking 9.6
confident 9.1
holding 9.1
employee 9
counter 8.7
smile 8.5
shop 8.5
portrait 8.4
occupation 8.2
lifestyle 7.9
device 7.9
standing 7.8
students 7.8
colleagues 7.8
music 7.7
youth 7.7
chart 7.6
learning 7.5
chair 7.5
manager 7.4
coffee 7.4
life 7.3
successful 7.3
handsome 7.1
science 7.1
to 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.7
clothing 93.9
person 90.9
man 83.6

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 98.3%
Calm 94.4%
Happy 2.8%
Sad 1%
Confused 0.5%
Disgusted 0.4%
Angry 0.4%
Surprised 0.3%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Female, 55.3%
Calm 96.5%
Happy 2.6%
Surprised 0.2%
Disgusted 0.2%
Confused 0.2%
Sad 0.2%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 45-53
Gender Male, 95.4%
Calm 55.5%
Happy 29.3%
Sad 6.2%
Confused 3.9%
Surprised 1.9%
Disgusted 1.6%
Angry 0.9%
Fear 0.7%

AWS Rekognition

Age 33-41
Gender Male, 99.5%
Calm 89.7%
Sad 5.6%
Angry 1.1%
Happy 0.8%
Surprised 0.8%
Disgusted 0.8%
Fear 0.6%
Confused 0.4%

AWS Rekognition

Age 27-37
Gender Male, 95.5%
Calm 90.7%
Sad 6.3%
Confused 0.8%
Angry 0.6%
Disgusted 0.5%
Surprised 0.5%
Fear 0.3%
Happy 0.2%

AWS Rekognition

Age 48-54
Gender Male, 77.9%
Sad 58.4%
Calm 21%
Happy 8%
Surprised 7.9%
Confused 1.5%
Disgusted 1.3%
Fear 1%
Angry 0.9%

AWS Rekognition

Age 16-22
Gender Female, 86.3%
Calm 96.3%
Sad 2.1%
Angry 0.7%
Happy 0.2%
Confused 0.2%
Surprised 0.2%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.5%
Sad 91.4%
Calm 4.6%
Disgusted 1.3%
Confused 1.1%
Angry 0.6%
Fear 0.4%
Happy 0.4%
Surprised 0.3%

AWS Rekognition

Age 21-29
Gender Female, 78%
Calm 55.7%
Happy 28.1%
Angry 6.5%
Sad 5.1%
Fear 1.8%
Confused 1.4%
Surprised 0.7%
Disgusted 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people standing in front of a building 93.3%
a group of people in front of a building 93.2%
a group of people that are standing in front of a building 89.1%

Text analysis

Amazon

NO
POTATOES
FREE
STEAK
AMERICAN
VEGETABLES
BEANS
16295.
16275.
LIMA
MAM
..
PROFANE
OREAN
STEAK CHOW
HOME
POTATOES g
ZOO
16275
BROWN
as
KACON MAM
ЭТАЯТIИ
-
AMERICAN CHEZ
NOT
FOOD STEAK
CLEON
CHICA
ROADS ROATT FREE ME
CHOW
ROATT
MJI7
.
MJI7 ЭТАЯТIИ АЗДА
GETTUCE ZOO
FRIED
BOOK
KACON
S .
1
FOOD
11 BOOK
40
HANSURO
ARIZO CHICA
S
11
g
COMMINADE
AACON
FARNCH FRIED IMCI
FARNCH
ALCAR -
ROADS
АЗДА
HAZH
GETTUCE
LIVERSE
LANGUADE
8003
ME
ALCAR
TONATORS
FREE WITH
IMCI
CHEZ
ARIZO
WITH

Google

16275. NO 90EANE 16275. MJI 3TARTIN AA T 2
NO
16275.
MJI
T
90EANE
2
3TARTIN
AA