Human Generated Data

Title

Untitled (Columbus, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2214

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Photography 100
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Face 99.1
Head 99.1
Portrait 99.1
Clothing 98.3
Hat 94.5
Person 93.9
Person 92.3
Shop 88
Person 85.4
Person 85.2
Body Part 78.2
Finger 78.2
Hand 78.2
Person 76.6
Newsstand 65.9
Person 60.1
Tripod 57.2
Cap 56.7
Worker 56.6
Advertisement 56.3
Baseball Cap 56.1
Poster 56
Sun Hat 55.5

Clarifai
created on 2018-05-10

people 99.9
adult 98.4
man 97.4
one 96
wear 95.7
two 93.2
veil 90.8
monochrome 88.1
group together 87.4
administration 83.8
vehicle 83.8
actor 81.9
recreation 81.3
portrait 81.3
three 80.1
street 79
group 78.3
dig 77.6
baseball 77.5
transportation system 77.5

Imagga
created on 2023-10-07

stall 62.9
seller 32.9
shop 30.9
man 20.8
people 19.5
person 19.3
industrial 19.1
work 18.8
industry 18.8
building 18.6
mercantile establishment 18.6
business 17.6
job 15.9
male 15.6
adult 14.9
bookshop 14.6
construction 14.5
transportation 14.3
house 14.2
men 13.7
transport 13.7
architecture 13.3
equipment 13.2
home 12.8
one 12.7
worker 12.6
old 12.5
city 12.5
place of business 12.3
occupation 11.9
vehicle 11.7
sky 11.5
smile 11.4
looking 11.2
safety 11
sea 10.9
happy 10.6
newspaper 10.3
sale 10.2
structure 10.1
engineering 9.5
lifestyle 9.4
street 9.2
sign 9
outdoors 9
landscape 8.9
tobacco shop 8.9
helmet 8.8
space 8.5
portrait 8.4
protection 8.2
technology 8.2
road 8.1
water 8
working 8
urban 7.9
day 7.8
hardhat 7.8
attractive 7.7
professional 7.7
ship 7.6
customer 7.6
casual 7.6
marine 7.6
horizontal 7.5
light 7.3
product 7.3
indoor 7.3
new 7.3
builder 7.3
black 7.2
travel 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

man 98.3
person 97.9
outdoor 97.4

Color Analysis

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.8%
Calm 86.2%
Surprised 6.5%
Sad 6.3%
Fear 6%
Happy 2%
Confused 1.5%
Disgusted 0.8%
Angry 0.6%

AWS Rekognition

Age 10-18
Gender Female, 76.3%
Happy 59.7%
Calm 15.6%
Fear 15.1%
Surprised 6.9%
Sad 3.6%
Confused 2.1%
Disgusted 1.2%
Angry 0.8%

AWS Rekognition

Age 22-30
Gender Male, 99.3%
Happy 30.1%
Calm 29%
Surprised 20.4%
Fear 9%
Angry 8.5%
Sad 4.1%
Confused 3.8%
Disgusted 1.5%

AWS Rekognition

Age 10-18
Gender Female, 62%
Sad 76.6%
Calm 43.2%
Fear 11.6%
Surprised 7.4%
Happy 2.8%
Confused 2.2%
Disgusted 1.9%
Angry 1.8%

AWS Rekognition

Age 18-26
Gender Male, 98.2%
Calm 78.6%
Surprised 8.3%
Fear 7.6%
Angry 5.8%
Sad 3.3%
Confused 2.2%
Happy 1.6%
Disgusted 1.1%

Microsoft Cognitive Services

Age 57
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Hat 94.5%

Captions

Text analysis

Amazon

EAT
DRINK
RESTAURANT
NEW
S
TOYES
Si
ZUCO
25
the