Human Generated Data

Title

Untitled (Virgil Thaxton, near Mechanicsburg, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1014

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2023-10-06

Body Part 100
Finger 100
Hand 100
Face 100
Head 100
Photography 100
Portrait 100
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Neck 57.1
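
These labels come from Amazon Rekognition's label-detection service. As a rough illustration of how such tags are produced, the sketch below calls Rekognition through boto3; the filename, region, and confidence threshold are placeholder assumptions, not details of the museum's actual pipeline.

import boto3

# Hedged sketch: label detection with Amazon Rekognition via boto3.
# "shahn_thaxton.jpg" and the region are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_thaxton.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the tags above are reported down to about 57% confidence
)

for label in response["Labels"]:
    # Each label has a name and a 0-100 confidence score,
    # matching entries such as "Neck 57.1" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')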

Clarifai
created on 2018-05-11

people 99.9
one 99.3
adult 99.1
portrait 98.9
man 98.5
administration 94.4
monochrome 93
wear 88.9
actor 88.1
facial expression 87.3
leader 87
music 85.9
writer 83.4
street 82.1
musician 79.3
sit 77.9
two 77.3
outfit 76.2
war 75.9
profile 75.5

Imagga
created on 2023-10-06

man 47.8
person 40
male 37.7
people 29.6
computer 29.5
laptop 29.3
scholar 29.1
happy 27
adult 25.9
intellectual 24.2
senior 22.5
home 20.8
military uniform 20.6
mature 20.5
sitting 19.8
uniform 19.6
indoors 19.3
smile 19.3
face 19.2
portrait 18.8
grandfather 18.7
looking 18.4
business 18.2
elderly 18.2
smiling 18.1
casual 17.8
office 17.7
clothing 17.3
working 16.8
technology 16.3
handsome 16.1
lifestyle 15.9
work 15.7
businessman 15
couple 14.8
old 14.6
men 14.6
notebook 14.4
glasses 13.9
suit 13.8
communication 12.6
attractive 11.9
indoor 11.9
call 11.7
leisure 11.6
job 11.5
one 11.2
child 11.1
professional 11.1
happiness 11
room 10.9
color 10.6
together 10.5
relaxed 10.3
corporate 10.3
day 10.2
horizontal 10.1
confident 10
aged 10
outdoor 9.9
guy 9.9
retirement 9.6
wireless 9.5
education 9.5
age 9.5
world 9.5
alone 9.1
relaxing 9.1
fun 9
book 8.8
browsing 8.8
look 8.8
retired 8.7
concentration 8.7
love 8.7
engineer 8.7
hand 8.7
covering 8.6
only 8.6
living 8.5
consumer goods 8.5
sit 8.5
two 8.5
pretty 8.4
head 8.4
pensioner 8.4
student 8.4
success 8.1
beard 8
hair 7.9
70s 7.9
living room 7.8
boy 7.8
hands 7.8
black 7.8
seated 7.8
older 7.8
modern 7.7
expression 7.7
sofa 7.7
employee 7.7
husband 7.6
resting 7.6
reading 7.6
desk 7.6
executive 7.5
holding 7.4
focus 7.4
businesswoman 7.3
family 7.1
worker 7.1
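
The Imagga tags above follow the same label-plus-confidence pattern. Below is a hedged sketch of the kind of request that yields them, based on Imagga's public v2 tagging endpoint as I understand it; the credentials and image URL are placeholders, and the exact response layout may differ from what is assumed here.

import requests

API_KEY = "your_api_key"        # placeholder credentials
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/shahn_thaxton.jpg"  # placeholder image URL

# Assumed endpoint and response shape per Imagga's v2 tagging API.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Confidence is a 0-100 float, as in "man 47.8" above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')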

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.2
man 97
outdoor 88.9

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 100%
Sad 98.5%
Confused 16%
Calm 15%
Surprised 8.3%
Fear 6.5%
Angry 3.3%
Disgusted 2.7%
Happy 1.6%
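
The age range, gender, and emotion percentages above are the face attributes returned by Amazon Rekognition. A minimal sketch of the corresponding call via boto3 follows; the filename and region are placeholders.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_thaxton.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 40, "High": 48}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 100.0}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')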

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
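
The verbal ratings above ("Very unlikely", and so on) are the Likelihood values returned by Google Cloud Vision face detection. A minimal sketch using the google-cloud-vision Python client is below; the filename is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_thaxton.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values 0-5 map to these verbal ratings.
likelihood_names = (
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"
)

for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])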

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
