Human Generated Data

Title

Untitled (Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1599

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Photography 100
Face 100
Head 100
Portrait 100
Clothing 100
Shirt 100
Person 99.6
Adult 99.6
Male 99.6
Man 99.6
Electrical Device 97.3
Microphone 97.3
Body Part 95.1
Finger 95.1
Hand 95.1
Smoke 90.7
Astronomy 55.8
Outer Space 55.8
Blouse 55.6
Smoking 55.6

Clarifai
created on 2018-05-11

people 99.9
one 99.2
adult 98.9
portrait 98.7
wear 97.8
man 97.7
outfit 86.8
music 83.4
monochrome 80.1
boy 79.7
administration 77.6
uniform 77.5
musician 75.4
retro 74.3
two 71.1
profile 71.1
woman 69.3
leader 68.1
facial expression 66.2
veil 63.4

Imagga
created on 2023-10-06

pay-phone 100
telephone 89
electronic equipment 67.2
equipment 39.8
man 34.2
male 27.6
people 24
portrait 23.9
person 23.5
black 22.4
adult 21.3
call 19
face 18.5
one 17.9
expression 17.9
guy 16.6
attractive 15.4
human 15
eyes 14.6
looking 14.4
model 14
hair 13.5
serious 13.3
suit 12.9
businessman 12.3
business 12.1
hand 11.4
hands 11.3
sexy 11.2
handsome 10.7
fashion 10.5
look 10.5
body 10.4
men 10.3
dark 10
lifestyle 9.4
hope 8.7
work 8.6
emotion 8.3
alone 8.2
style 8.2
posing 8
smile 7.8
modern 7.7
pretty 7.7
casual 7.6
professional 7.6
tie 7.6
head 7.6
strong 7.5
computer 7.2
fitness 7.2
macho 7.1
love 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.1
man 90.4
standing 81
posing 62.2

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 55.1%
Fear 97.9%
Surprised 6.3%
Sad 2.2%
Calm 1.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Adult 99.6%
Male 99.6%
Man 99.6%

Categories

Imagga

cars vehicles 95.8%
paintings art 2.7%