Human Generated Data

Title

Untitled (New Orleans, Louisiana?)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1489

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Dining Table 98.7%
Furniture 98.7%
Table 98.7%
Photography 97.8%
Indoors 97%
Restaurant 97%
Architecture 96.6%
Building 96.6%
Dining Room 96.6%
Room 96.6%
Face 93.5%
Head 93.5%
Food 84.6%
Meal 84.6%
Dish 82.2%
Beverage 79.8%
Alcohol 74.2%
Portrait 73.8%
Clothing 66.9%
Footwear 66.9%
Shoe 66.9%
Plate 65.3%
Hat 57.7%
Cup 57.3%
Cafeteria 57.1%
Diner 56.9%
Beer 56.6%
Smoke 55.7%
Drinking 55.2%
Cap 55.1%
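
These labels are the kind returned by Amazon Rekognition's label-detection API. Below is a minimal Python sketch of such a call using boto3; the image file name is hypothetical, and the MinConfidence cutoff is an assumption inferred from the lowest score above.

import boto3

# Hypothetical local copy of the photograph; Rekognition also accepts
# images stored in S3.
IMAGE_PATH = "shahn_new_orleans_1935.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the tags above bottom out near 55%
    )

# Print "Label confidence%" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")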

Clarifai
created on 2018-05-11

people 99.6%
adult 98.5%
one 98%
portrait 93%
sit 89.4%
man 86.9%
woman 85.8%
wear 85.7%
room 84.4%
lid 80.5%
indoors 79.1%
military 79%
furniture 76.9%
chair 75.4%
veil 72.4%
facial expression 71.7%
recreation 67.9%
music 67.8%
war 67.4%
sitting 67%
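
Clarifai concepts of this form are typically returned by its v2 predict endpoint. The following is a hedged Python sketch against the public REST API; the model alias, API key, and image URL are placeholders, and the exact request shape may differ across Clarifai API versions.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
MODEL = "general-image-recognition"  # assumed alias for the general model
IMAGE_URL = "https://example.com/image.jpg"  # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 "value"; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")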

Imagga
created on 2023-10-06

person 36.1%
man 35.6%
disk jockey 35.5%
male 32%
adult 29.3%
work 28.3%
broadcaster 27.6%
people 25.1%
communicator 21.8%
happy 21.3%
working 21.2%
computer 20.9%
office 20.2%
business 20%
kitchen 19%
smiling 18.1%
laptop 17.7%
home 17.5%
indoors 16.7%
cheerful 16.2%
looking 16%
job 15.9%
professional 15.5%
lifestyle 15.2%
sitting 14.6%
technology 14.1%
attractive 13.3%
smile 12.8%
student 12.1%
face 12.1%
businesswoman 11.8%
portrait 11.6%
businessman 11.5%
desk 11.3%
standing 11.3%
men 11.2%
cook 11%
happiness 11%
house 10.9%
holding 10.7%
hand 10.6%
musical instrument 10.5%
couple 10.4%
one 10.4%
women 10.3%
handsome 9.8%
pretty 9.8%
worker 9.7%
black 9.7%
cooking 9.6%
casual 9.3%
glasses 9.3%
clothing 9.2%
bartender 9%
food 8.8%
table 8.8%
chef 8.7%
education 8.7%
corporate 8.6%
talking 8.5%
two 8.5%
shirt 8.4%
communication 8.4%
occupation 8.2%
cup 8%
hair 7.9%
uniform 7.8%
mature 7.4%
coffee 7.4%
banjo 7.3%
guy 7.3%
team 7.2%
stringed instrument 7.1%
employee 7.1%
interior 7.1%
device 7.1%
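
Imagga tags like these come from its v2 tagging endpoint, which authenticates with an API key/secret pair over HTTP basic auth. A hedged Python sketch follows; the credentials and image URL are placeholders, and the response layout follows Imagga's documented result/tags structure.

import requests

IMAGE_URL = "https://example.com/image.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholders
)
resp.raise_for_status()

# Each entry pairs a confidence score with a localized tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}%")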

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 97.9%
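
A single Microsoft tag like this suggests the Azure Computer Vision "analyze" operation with the Tags visual feature. A hedged Python sketch follows; the resource endpoint, subscription key, API version, and image URL are all placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",  # assumed API version
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/image.jpg"},  # placeholder
)
resp.raise_for_status()

# Tag confidences are 0-1; scale to match the percentage above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")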

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 99.8%
Surprised 91.6%
Calm 28.2%
Fear 8.4%
Sad 3.1%
Confused 1.3%
Happy 1.2%
Angry 0.8%
Disgusted 0.8%
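
An age range, a gender estimate, and a ranked emotion list like this match the output of Rekognition's detect_faces call with all facial attributes enabled. A minimal Python sketch follows; the image file name is hypothetical.

import boto3

client = boto3.client("rekognition")

with open("shahn_new_orleans_1935.jpg", "rb") as f:  # hypothetical file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; order them to mirror the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")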

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
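
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A hedged sketch using the google-cloud-vision Python client follows; the image file name is hypothetical.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_new_orleans_1935.jpg", "rb") as f:  # hypothetical file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each annotation exposes the likelihood fields shown above as enums.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)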

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Shoe 66.9%
Plate 65.3%

Categories