Human Generated Data

Title

Untitled (Ben Shahn taking a photograph, Asia)

Date

January 14, 1960-April 22, 1960

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1998.140

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Poster 97.2
Advertisement 97.2
Text 88.4
Person 84.9
Human 84.9
Art 71.9
Leisure Activities 66.6
Collage 58.4
Canvas 55.8

Clarifai
created on 2023-10-25

people 98.7
retro 96.6
man 96.5
art 96.3
vintage 96.1
one 95.7
portrait 93.5
wear 93.2
adult 92.9
street 92.7
painting 92.6
collage 92.2
music 91.4
monochrome 89.2
old 88.7
two 85.9
nostalgia 83.8
dirty 81.4
woman 80.5
antique 80.1

Imagga
created on 2021-12-15

newspaper 85
product 72.6
creation 56.3
money 44.2
currency 35
dollar 33.4
cash 32.9
business 32.2
finance 30.4
bank 26
dollars 24.1
banking 22.1
bill 21.9
wealth 21.5
paper 21.2
hundred 20.3
banknote 19.4
savings 18.6
musical instrument 18.3
accordion 17.6
book 17.2
financial 16.9
investment 16.5
exchange 15.3
rich 14.9
bills 14.6
payment 14.4
keyboard instrument 14.3
us 13.5
man 13.4
buy 13.1
economy 13
success 12.9
paying 12.6
person 12.5
pay 12.5
hands 12.2
hand 12.2
market 11.5
black 11.4
wind instrument 11.3
note 11
holding 10.7
debt 10.6
male 9.9
buying 9.6
office 9.6
notes 9.6
loan 9.6
stock 9.4
old 9.1
one 9
businessman 8.8
finances 8.7
change 8.7
envelope 8.5
shopping 8.4
sale 8.3
close 8
scholar 7.9
twenty 7.9
salary 7.9
banknotes 7.8
gambling 7.8
pound 7.7
trade 7.7
commerce 7.5
number 7.5
vintage 7.4
intellectual 7.3
adult 7.2
concepts 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

handwriting 98.5
text 98.5
art 77.1
book 76.8
poster 76.6
music 53

Face analysis

Amazon

AWS Rekognition

Age 35-51
Gender Female, 80.2%
Calm 80.7%
Sad 6.7%
Happy 5.4%
Angry 2.3%
Disgusted 1.8%
Surprised 1.4%
Fear 1.1%
Confused 0.6%

Feature analysis

Amazon

Poster 97.2%
Person 84.9%

Categories

Imagga

paintings art 99.2%

Text analysis

Amazon

JAPAN
ME
ABOUT
TELL ME ABOUT
TELL

Google

ME APOUT JAPAN
APOUT
ME
JAPAN