Human Generated Data

Title

Untitled (firemen eating at table, served by women)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15111

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.4
Person 99
Person 95.6
Person 94.8
Tabletop 93.5
Furniture 93.5
Beverage 91.1
Drink 91.1
Table 89.1
Alcohol 88.3
Bottle 87.1
Clothing 85.2
Apparel 85.2
Room 80.7
Indoors 80.7
Dining Table 77
Wine 75.9
Hat 69.8
Glass 68.7
Suit 63.4
Coat 63.4
Overcoat 63.4
Portrait 61.4
Photography 61.4
Photo 61.4
Face 61.4
Chess 60.2
Game 60.2
Dining Room 57.5
Bar Counter 57.2
Pub 57.2
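
The label/confidence pairs above match the shape of output returned by AWS Rekognition's label-detection call. A minimal sketch of how such tags might be produced, assuming boto3 credentials are configured and a local copy of the image; the file name "gould_15111.jpg" is hypothetical:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gould_15111.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the tag list above bottoms out near 57
)

for label in response["Labels"]:
    # Prints e.g. "Person 99.7", mirroring the tag list in this record.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')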

Clarifai
created on 2023-10-29

people 99.3
indoors 96.5
man 95.1
woman 94.9
monochrome 94
adult 93.2
group 92.4
child 90.1
room 84.3
education 83.7
furniture 82.9
chess 82.5
three 82.3
board game 80.4
war 78.5
family 77
science 76.8
group together 72
two 69.7
scientist 68.4

Imagga
created on 2022-03-05

man 29.5
people 29
office 24.2
computer 24.1
professional 23.3
case 23.3
indoors 22.8
male 21.3
business 21.2
working 21.2
work 21.2
person 20.5
adult 19.5
businessman 19.4
room 19
technology 18.5
medical 17.6
newspaper 17.3
worker 17.1
home 16.7
laptop 16.6
monitor 16.4
interior 15.9
hospital 15.6
patient 15.5
shop 15.2
doctor 15
clinic 15
health 14.6
kitchen 14.5
product 14.3
lifestyle 13.7
looking 13.6
equipment 13.2
senior 13.1
smiling 13
modern 12.6
house 12.5
job 12.4
medicine 12.3
desk 12.3
men 12
team 11.6
screen 11.4
businesspeople 11.4
corporate 11.2
portrait 11
occupation 11
communication 10.9
businesswoman 10.9
horizontal 10.9
creation 10.7
one 10.4
coat 10.4
indoor 10
nurse 9.7
illness 9.5
keyboard 9.4
casual 9.3
teamwork 9.3
barbershop 9.3
executive 9.2
table 9
specialist 8.9
happy 8.8
laboratory 8.7
sitting 8.6
furniture 8.5
meeting 8.5
mercantile establishment 8.5
back 8.3
holding 8.2
care 8.2
alone 8.2
student 8.1
science 8
women 7.9
sterile 7.9
day 7.8
education 7.8
lab 7.8
30s 7.7
old 7.7
elderly 7.7
one person 7.5
dishwasher 7.5
cheerful 7.3
group 7.2
machine 7.2
bright 7.1
lab coat 7.1
face 7.1
information 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.6
person 90.2
clothing 78.7
black and white 54.7
vase 52.6
woman 52.3
table 51.6

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 51.9%
Calm 78.9%
Sad 17.2%
Confused 1.7%
Angry 0.8%
Happy 0.7%
Surprised 0.3%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Sad 90.3%
Calm 3.7%
Happy 3.1%
Confused 1.2%
Angry 0.9%
Disgusted 0.4%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 53-61
Gender Male, 98.8%
Sad 78.4%
Calm 16.5%
Confused 2.1%
Surprised 1.6%
Angry 0.8%
Disgusted 0.4%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 37-45
Gender Female, 99.7%
Happy 88.2%
Calm 7.8%
Sad 1.1%
Surprised 0.8%
Fear 0.7%
Confused 0.5%
Angry 0.5%
Disgusted 0.4%
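
The four face records above (age range, gender, and per-emotion percentages) follow the shape of AWS Rekognition's face-detection response. A minimal sketch, reusing the hypothetical image file from the earlier example:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gould_15111.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    print(f'{top_emotion["Type"].title()} {top_emotion["Confidence"]:.1f}%')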

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
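
The Google Vision face entries report categorical likelihoods (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client, again with the hypothetical image file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gould_15111.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields are enums such as VERY_UNLIKELY, POSSIBLE, LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)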

Feature analysis

Amazon

Person
Hat
Chess
Person 99.7%
Person 99.4%
Person 99%
Person 95.6%
Person 94.8%
Hat 69.8%
Chess 60.2%

Categories

Imagga

interior objects 87.3%
paintings art 10.5%

Text analysis

Amazon

SAFETY
K
DAK
M
K o DAK SAFETY FILM
FIL M
FIL
o
FILM
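
The fragments above ("K o DAK SAFETY FILM", "FIL M", etc.) are Rekognition text detections of the Kodak edge printing on the negative. A minimal sketch of the call that yields them, assuming the same hypothetical file name used earlier:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gould_15111.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE detections cover whole strings; WORD detections cover fragments.
    print(detection["Type"], detection["DetectedText"])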

Google

FILM KODAK S'A EEIY E!!M
FILM
KODAK
S'A
EEIY
E!!M