Human Generated Data

Title

Untitled (Judith and Ezra Shahn, New York City)

Date

April 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.192

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.3
Human 99.3
Shop 87.1
Person 86.2
Clothing 71.6
Apparel 71.6
Meal 69
Food 69
Long Sleeve 65.1
Sleeve 65.1
Text 64.7
Overcoat 56.6
Coat 56.6
Window Display 55.8

Clarifai
created on 2023-10-15

people 99.8
street 97.9
monochrome 97.8
adult 97.5
man 97
one 95.5
portrait 94.3
music 93.8
two 93
woman 90.5
wear 88.3
child 87.2
furniture 85.9
group 85.2
art 85.2
three 84.4
musician 83.7
room 82.9
boy 82.5
instrument 82.1

Imagga
created on 2021-12-15

freight car 48.3
car 41.6
percussion instrument 35.9
steel drum 35.3
musical instrument 33.3
wheeled vehicle 29.2
vehicle 19.4
architecture 16.6
building 16.4
business 16.4
man 15.4
technology 14.8
work 14.1
computer 13.8
light 13.4
modern 12.6
city 12.5
construction 12
laptop 11.8
office 11.8
people 11.7
job 11.5
working 11.5
industry 11.1
worker 10.7
digital 10.5
metal 10.5
device 10.4
window 10.3
barbershop 9.9
old 9.7
conveyance 9.7
urban 9.6
render 9.5
shop 9.5
equipment 9.4
fire 9.4
silhouette 9.1
structure 9.1
machine 9
black 9
steel 8.8
men 8.6
finance 8.4
industrial 8.2
businessman 7.9
male 7.8
labor 7.8
3d 7.7
sky 7.6
hand 7.6
power 7.5
person 7.4
safety 7.4
room 7.3
music 7.2
interior 7.1
travel 7
indoors 7

Microsoft
created on 2021-12-15

text 98.1
outdoor 88.6
black and white 75.3
hat 62.2
cartoon 56.8

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 87.6%
Happy 76.6%
Calm 19.7%
Sad 2%
Surprised 0.5%
Confused 0.3%
Disgusted 0.3%
Angry 0.3%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

UNITED
RN
STATES
GRAMS
UNITED STATES LI
ON
APR
APR 2
LI
2
IRELAND
..
MANNATT
-
- and
EMAIL
and
EMAIL 44
Bur
UNIO Bur
44
UNIO
Camaro

Google

MANNATT APR 2 UNITED STATES LIN RN IRELAND ENAN EGRAMS
MANNATT
APR
2
UNITED
STATES
LIN
RN
IRELAND
ENAN
EGRAMS