Human Generated Data

Title

Untitled (passersby watching street portrait artist "Amo" at work, NYC)

Date

1940s

People

Artist: Mary Lowber Tiers, American 1916 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15898.3

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.6
Human 98.6
Person 98.4
Person 98.3
Clothing 97.7
Apparel 97.7
Person 92.6
Poster 88.7
Advertisement 88.7
Text 75.5
People 74.4
Person 71.3
Person 70
Face 69.9
Coat 69.3
Person 65.5
Overcoat 65.2
Flyer 62.4
Brochure 62.4
Paper 62.4
Suit 61.7
Photography 60.5
Photo 60.5
Collage 55.3
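
These label/confidence pairs have the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how similar tags could be retrieved, assuming configured AWS credentials and a hypothetical local file name:

```python
# Sketch: image labels from Amazon Rekognition via boto3 (assumed setup).
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 55%
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```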

Clarifai
created on 2023-10-29

people 100
monochrome 99.7
adult 99.1
man 98.4
group 98.4
woman 98
group together 97
wear 94.6
many 94.5
actress 92.6
administration 91.4
street 90.4
child 89.7
music 87.6
actor 86.8
leader 86.3
war 86.1
recreation 86
several 85.6
musician 85.6
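
These concept/score pairs resemble output from Clarifai's general image-recognition model. A hedged sketch over Clarifai's v2 REST API; the endpoint path, model ID, and key handling are assumptions, and the image URL is hypothetical:

```python
# Sketch: concept tags from Clarifai's general model (endpoint and model ID assumed).
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concept values are 0-1; scale to match the 0-100 scores listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```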

Imagga
created on 2022-02-05

newspaper 37.3
daily 35.5
product 29.2
shop 28.1
old 23
creation 22.8
barbershop 20.2
mercantile establishment 20.1
vintage 19.8
negative 17.8
currency 17
art 16.3
film 16.2
money 15.3
building 15.3
antique 14.7
cash 14.6
book jacket 14.1
architecture 14.1
ancient 13.8
history 13.4
paper 13.4
place of business 13.4
bill 13.3
dollar 13
historic 12.8
people 12.8
business 12.8
finance 12.7
bank 12.5
man 12.1
retro 11.5
jacket 11
city 10.8
photographic paper 10.3
wall 10.3
decoration 10.2
economy 10.2
banking 10.1
aged 9.9
religion 9.9
financial 9.8
room 9.7
sculpture 9.7
window 9.5
grunge 9.4
house 9.2
interior 8.8
symbol 8.7
hundred 8.7
culture 8.5
historical 8.5
monument 8.4
savings 8.4
wrapping 8.3
wealth 8.1
home 8
texture 7.6
tourism 7.4
church 7.4
design 7.3
black 7.2
portrait 7.1
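
These tags match the shape of Imagga's tagging endpoint output. A sketch against the v2 REST API, assuming placeholder credentials and a hypothetical image URL:

```python
# Sketch: tags from Imagga's /v2/tags endpoint (credentials and URL are placeholders).
import requests

API_KEY, API_SECRET = "YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```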

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8
person 92.6
clothing 91.5
outdoor 90.1
woman 69.9
posing 48.7
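
These tags look like output from Azure Computer Vision's Analyze Image operation. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

```python
# Sketch: tags from Azure Computer Vision Analyze Image v3.2 (endpoint/key assumed).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

# Confidences are 0-1; scale to match the 0-100 scores listed above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```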

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 79.7%
Calm 98%
Surprised 1%
Fear 0.6%
Sad 0.1%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%
Confused 0%

AWS Rekognition

Age 18-26
Gender Female, 50.1%
Calm 63.8%
Happy 18.5%
Sad 5.5%
Confused 4.9%
Surprised 2.7%
Disgusted 2.4%
Fear 1.5%
Angry 0.7%

AWS Rekognition

Age 37-45
Gender Male, 86.6%
Calm 82.8%
Surprised 5.4%
Disgusted 4.1%
Happy 2.1%
Sad 2.1%
Angry 1.8%
Confused 0.9%
Fear 0.8%

AWS Rekognition

Age 45-51
Gender Female, 70.6%
Calm 99.2%
Sad 0.6%
Confused 0.1%
Angry 0.1%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 74.5%
Happy 60.5%
Calm 19.7%
Fear 12.9%
Sad 2.9%
Surprised 1.6%
Confused 1%
Disgusted 1%
Angry 0.5%

AWS Rekognition

Age 23-31
Gender Female, 63.4%
Surprised 50.4%
Calm 43.9%
Angry 2.4%
Confused 0.9%
Fear 0.8%
Disgusted 0.7%
Sad 0.5%
Happy 0.4%
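
Each block above follows the shape of a single FaceDetail from Amazon Rekognition's DetectFaces operation (age range, gender, and emotion confidences). A minimal sketch, assuming configured AWS credentials and a hypothetical file name:

```python
# Sketch: per-face age range, gender, and emotion scores from Rekognition DetectFaces.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence to match the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```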

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
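
The Google Vision blocks report per-face likelihood ratings (Very unlikely through Very likely) rather than percentages. A sketch using the Google Cloud Vision Python client, assuming configured credentials and a hypothetical file path:

```python
# Sketch: per-face likelihood ratings from Google Cloud Vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical file path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enum names run VERY_UNLIKELY ... VERY_LIKELY, as listed above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```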

Feature analysis

Amazon

Person
Person 98.6%
Person 98.4%
Person 98.3%
Person 92.6%
Person 71.3%
Person 70%
Person 65.5%

Poster
Poster 88.7%
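
The separate per-object percentages above are what Rekognition's DetectLabels reports for individual instances of a label, each with its own bounding box. A sketch under the same assumed boto3 setup:

```python
# Sketch: per-instance confidences and bounding boxes from Rekognition DetectLabels.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f})')
```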

Categories

Imagga

paintings art 98.9%
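
This category score has the shape of Imagga's categorization endpoint output. A hedged sketch; the categorizer ID is an assumption, and the credentials and image URL are placeholders:

```python
# Sketch: category scores from Imagga's /v2/categories endpoint (categorizer ID assumed).
import requests

API_KEY, API_SECRET = "YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')
```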

Text analysis

Amazon

OR
PORTRAIT
YOUR PORTRAIT
YOUR
OR CARICATURE
HAND
CARICATURE
BY
LAUNDRY
BEST
PROFILE
EST
MACDOUGAL
MACDOUGAL HAND PARK
BEST GLUEST
EST 1999
PARK
GLUEST
Amo'
in
1999
VILLAGE
PROFILE I-
NATURE
ruis
I-
CAR NATURE JU
CAR
Card ruis
LIF
Card
JU
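
The fragments above, including the garbled ones, are the kind of word- and line-level detections returned by Amazon Rekognition's DetectText operation. A minimal sketch, assuming configured AWS credentials and a hypothetical file name:

```python
# Sketch: word/line text detections from Rekognition DetectText.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; hand-painted signage often yields noisy words.
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}%')
```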

Google

HACDOUBAL PARK HAND LAUNDRY FILLASE EST 1 YOUR PORTRAIT OR CARICATURE Amo BY PROFILE I- CARICATES"
HACDOUBAL
PARK
HAND
LAUNDRY
FILLASE
EST
1
YOUR
PORTRAIT
OR
CARICATURE
Amo
BY
PROFILE
I-
CARICATES"
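
Google's listing starts with the full detected block and then repeats the individual words, which matches the structure of Cloud Vision text detection. A sketch, assuming configured credentials and a hypothetical file path:

```python
# Sketch: OCR results from Google Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical file path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the entire detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```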