Human Generated Data

Title

Untitled (Plain City, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2048

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Person 99.5
Face 88.6
Head 88.6
Architecture 62.2
Building 62.2
Factory 62.2
Machine 61.9
Body Part 57.1
Finger 57.1
Hand 57.1
Manufacturing 56.5
Sewing 55.2

Clarifai
created on 2018-05-10

people 99.9
adult 99.1
man 97.2
one 97.1
monochrome 97.1
two 94.5
woman 92.3
war 91.4
child 91
group 90.7
furniture 90.4
room 90.3
wear 90.1
sit 89.5
concentration 87.4
administration 85.3
indoors 85
medical practitioner 83.5
uniform 81
group together 80.6

Imagga
created on 2023-10-05

man 38.3
person 28.7
people 25.7
male 24.8
work 22.8
barbershop 21.3
working 19.4
worker 18.3
shop 18.2
job 16.8
adult 16.4
occupation 14.7
men 14.6
bartender 14.6
industry 14.5
skill 14.4
professional 14.1
mercantile establishment 13.6
equipment 13.5
repair 12.4
sitting 12
device 12
home 12
profession 11.5
indoors 11.4
hand 11.4
tool 11.4
face 11.4
looking 11.2
construction 11.1
instrument 10.9
black 10.8
labor 10.7
office 10.7
one 10.4
portrait 10.3
mature 10.2
safety 10.1
industrial 10
smile 10
kitchen 9.8
lifestyle 9.4
business 9.1
place of business 9.1
coat 8.9
craftsman 8.9
to 8.8
metal 8.8
look 8.8
movie 8.7
sit 8.5
gun 8.5
senior 8.4
glasses 8.3
medical 7.9
scientist 7.8
manual 7.8
product 7.8
table 7.8
factory 7.7
exam 7.7
craft 7.6
workplace 7.6
research 7.6
building 7.5
hospital 7.5
doctor 7.5
technology 7.4
weapon 7.4
room 7.4
phone 7.4
inside 7.4
indoor 7.3
protection 7.3
smiling 7.2
school 7.2
workshop 7.1
interior 7.1

Microsoft
created on 2018-05-10

person 99.4
man 93.6

Face analysis

Amazon

AWS Rekognition

Age 54-64
Gender Male, 98.3%
Calm 65.2%
Confused 14.8%
Sad 13.8%
Surprised 6.7%
Fear 6.1%
Angry 3.5%
Happy 0.9%
Disgusted 0.4%

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 98.3%
pets animals 1.5%

Text analysis

Amazon

TO
CASH
TO AL
AL
-
- -
THE
can -
- Serving
all
all -
.
BER
can
Serving

Google

CA TO A
CA
TO
A