Human Generated Data

Title

Untitled (Dr. Herman M. Juergens seated at table talking with patient)

Date

1965-1968

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.492.4

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-08-09

Person 99.9
Human 99.9
Person 99
Wood 88.7
Plywood 88.7
Apparel 82.5
Clothing 82.5
Helmet 67.2
Home Decor 55.9
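
The label/score pairs above match the shape of an AWS Rekognition DetectLabels response. A minimal sketch of reproducing such tags with boto3; the region and image file name are placeholders, not part of this record:

```python
# Minimal sketch: AWS Rekognition label detection via boto3.
# "us-east-1" and "photo.jpg" are placeholder values.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50.0,
    )

for label in response["Labels"]:
    # Prints pairs like "Person 99.9" or "Wood 88.7".
    print(f"{label['Name']} {label['Confidence']:.1f}")
```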

Clarifai
created on 2019-08-09

people 99.9
one 99.4
adult 99.3
two 98.3
furniture 98.2
man 97.8
room 96
woman 92.6
wear 92.2
group 90
sit 89.8
administration 87.3
indoors 87.2
concentration 86.9
leader 86.3
three 85.9
scientist 85.6
war 83.8
desk 83.8
portrait 83.4
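
These concepts follow the output shape of Clarifai's general prediction model. A minimal sketch against the v2 REST API; the API key, model ID, and image URL are placeholder assumptions:

```python
# Minimal sketch: Clarifai v2 predict endpoint over plain HTTP.
# "YOUR_API_KEY", the model ID, and the image URL are placeholders.
import requests

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1; the record above shows them scaled to
    # percent, e.g. "people 99.9".
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```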

Imagga
created on 2019-08-09

person 28.8
man 24.8
people 23.4
patient 19
adult 17.4
male 16.4
home 15.9
room 14.7
indoors 14
men 13.7
case 13.1
lifestyle 13
sick person 12.4
sitting 12
worker 11.7
couch 11.6
black 10.8
shop 10.7
sax 10.6
human 10.5
barbershop 10.4
brass 9.9
wind instrument 9.8
device 9.4
work 9.4
snow 9.2
house 9.2
holding 9.1
portrait 9.1
job 8.8
crutch 8.7
women 8.7
light 8.7
happiness 8.6
business 8.5
professional 8.4
chair 8.4
relaxation 8.4
equipment 8.3
mask 8.2
happy 8.1
life 8
interior 8
businessman 7.9
love 7.9
play 7.7
building 7.7
modern 7.7
attractive 7.7
winter 7.7
casual 7.6
reading 7.6
hand 7.6
fashion 7.5
inside 7.4
musical instrument 7.3
table 7.1
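
Tags of this shape come from Imagga's v2 tagging endpoint, which uses HTTP basic auth. A minimal sketch; the key, secret, and image URL are placeholders:

```python
# Minimal sketch: Imagga v2 /tags endpoint.
# "YOUR_API_KEY", "YOUR_API_SECRET", and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Imagga confidences are already percent-scale,
    # e.g. "person 28.8", "man 24.8".
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```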

Google
created on 2019-08-09

Photograph 96.2
Snapshot 86.9
Black-and-white 80.7
Photography 74.8
Room 74.4
Sitting 66.1
Monochrome 64.3
Furniture 53.3
Child 52
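
These labels match Google Cloud Vision label detection. A minimal sketch with the google-cloud-vision client, assuming application-default credentials are configured and a placeholder image file:

```python
# Minimal sketch: Google Cloud Vision label detection.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1; the record above shows them as percentages,
    # e.g. "Photograph 96.2".
    print(f"{label.description} {label.score * 100:.1f}")
```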

Microsoft
created on 2019-08-09

person 95.6
clothing 92.5
black and white 90.6
indoor 88
table 86.4
text 85.2
furniture 74.9
man 74.9
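
These tags match the Azure Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
# Minimal sketch: Azure Computer Vision image tagging.
# Endpoint, key, and image URL are placeholder values.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    # Confidences are 0-1; shown above as percent, e.g. "person 95.6".
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```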

Face analysis

Amazon

AWS Rekognition (face 1)

Age 22-34
Gender Male, 52.7%
Fear 49.4%
Calm 49.3%
Sad 45.5%
Happy 45.5%
Surprised 45.2%
Angry 45.1%
Confused 45%
Disgusted 45%

AWS Rekognition (face 2)

Age 39-57
Gender Male, 54.4%
Calm 50.8%
Sad 48.4%
Angry 45.3%
Fear 45.2%
Surprised 45.1%
Confused 45.1%
Disgusted 45.1%
Happy 45.1%
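
The per-face age, gender, and emotion estimates above follow the shape of AWS Rekognition's DetectFaces response. A minimal sketch with boto3; the region and image file name are placeholders:

```python
# Minimal sketch: AWS Rekognition face analysis via boto3.
# "us-east-1" and "photo.jpg" are placeholder values.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase, e.g. "CALM 50.8".
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```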

Feature analysis

Amazon

Person 99.9%
Helmet 67.2%

Captions

Microsoft
created on 2019-08-09

a man sitting on a counter 72.7%
a man sitting on a table 64.1%
a man sitting at a table 64%
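
Captions with confidence scores like these match the Azure Computer Vision describe operation. A minimal sketch with the same SDK as the tagging example; the endpoint, key, and image URL are placeholders:

```python
# Minimal sketch: Azure Computer Vision image description (captions).
# Endpoint, key, and image URL are placeholder values.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image(
    "https://example.com/photo.jpg",
    max_candidates=3,  # return several ranked captions
)
for caption in description.captions:
    # Prints lines like "a man sitting on a counter 72.7%".
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```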