Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1647

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Photography 100
Face 100
Head 100
Portrait 100
Body Part 99.9
Finger 99.9
Hand 99.9
Person 99.7
Adult 99.7
Male 99.7
Man 99.7
Architecture 90.7
Building 90.7
Outdoors 90.7
Shelter 90.7
Text 61.1
Clothing 56.7
Shirt 56.7
Coat 56.6
Jacket 56.6
Plant 56.5
Tree 56.5
Electronics 55.9

Clarifai
created on 2018-05-11

people 99.7
adult 97.4
man 97.3
administration 95.8
one 95.2
monochrome 91.6
portrait 91.1
leader 90.6
war 86.5
street 81.4
actor 78.9
election 77.3
offense 75.8
music 74.9
sit 71.7
military 71.4
home 70.9
candidate 70.7
vehicle 68.8
dig 68.3

Imagga
created on 2023-10-06

billboard 29.1
man 25.5
signboard 23.2
male 21.3
structure 20.9
person 20.4
business 20
building 17.2
office 16.6
call 16.6
old 15.3
suit 15.2
adult 14.9
people 14.5
architecture 12.7
city 12.5
businessman 12.4
window 12
shop 11.4
men 11.2
working 10.6
one 10.4
looking 10.4
professional 10.3
wall 10.3
work 10.2
sky 10.2
portrait 9.7
manager 9.3
house 9.2
equipment 9.1
sign 9
success 8.8
urban 8.7
standing 8.7
ancient 8.6
corporate 8.6
barbershop 8.5
casual 8.5
black 8.4
mercantile establishment 8.3
street 8.3
confident 8.2
guy 8
job 8
lifestyle 7.9
outside 7.7
executive 7.6
happy 7.5
outdoors 7.5
town 7.4
worker 7.3
metal 7.2
aged 7.2
home 7.2
face 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 98.9
person 98.1
man 97.9

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 100%
Angry 38.7%
Happy 38.6%
Calm 9.7%
Disgusted 7.3%
Surprised 6.9%
Fear 6.2%
Sad 2.9%
Confused 1.4%

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Adult 99.7%
Male 99.7%
Man 99.7%

Categories

Text analysis

Amazon

UNIVE
ME
V
UE
MISSIO
presents

Google

UE DMISSI UNI Fesen ME
UE
DMISSI
UNI
Fesen
ME