Human Generated Data

Title

Untitled (Columbus, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.363

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Tripod 100
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Photography 99.5
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Car 84.6
Transportation 84.6
Vehicle 84.6
Machine 82.9
Wheel 82.9
Person 79
Face 77.1
Head 77.1
Clothing 71.6
Hat 71.6
Photographer 66.9
Person 65.1
Person 64.5
Hat 56
Camera 55.4
Electronics 55.4
Video Camera 55.4
City 55.4

Clarifai
created on 2018-05-11

people 99.9
adult 99.3
man 96.9
monochrome 96.3
street 96
group together 94.3
woman 93
wear 92.8
two 90.9
group 90.7
one 87.4
vehicle 87.3
transportation system 84.8
merchant 84.6
several 82.8
four 81.7
furniture 79.9
three 78.7
sit 78.6
administration 77.9

Imagga
created on 2023-10-07

shop 38.7
building 30.3
barbershop 30.2
restaurant 28.6
mercantile establishment 26.4
musical instrument 24.3
percussion instrument 22
steel drum 21.1
city 20.8
people 19.5
man 18.8
place of business 17.5
adult 17.5
structure 15.9
architecture 14
male 13.5
chair 13.3
business 12.7
urban 12.2
person 11.7
couple 11.3
men 11.2
old 11.1
street 11
outdoors 10.4
working 9.7
women 9.5
sitting 9.4
lifestyle 9.4
exterior 9.2
house 9.2
travel 9.1
home 8.8
establishment 8.7
office 8.7
seller 8.5
two 8.5
fashion 8.3
transportation 8.1
family 8
smiling 8
glass 7.9
window 7.7
furniture 7.6
friends 7.5
back 7.3
indoor 7.3
holiday 7.2
interior 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

building 99.9
outdoor 99.6
person 92.6

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 41-49
Gender Male, 99.5%
Happy 88.7%
Surprised 6.9%
Fear 6.2%
Sad 2.9%
Calm 2.6%
Confused 2.5%
Disgusted 1.2%
Angry 0.7%

AWS Rekognition

Age 29-39
Gender Male, 67.1%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Angry 0%
Disgusted 0%
Confused 0%

Microsoft Cognitive Services

Age 56
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Car 84.6%
Wheel 82.9%
Hat 71.6%

Categories

Text analysis

Amazon

ROOMS
VIRGINIA
RESTAURANT
FURNISHED
DRINK
Coca-Cola
STE
CON
259
C
ECONDMY
CHC
ECONDMY FONTIE LL
261
TOKES
Coca
FONTIE LL

Google

DRINK OMS RESTAURANT
DRINK
OMS
RESTAURANT