Human Generated Data

Title

Untitled (street artist in beret with onlookers and well dressed woman in hat)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15817

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.8
Human 99.8
Clothing 99.3
Apparel 99.3
Person 98.7
Person 98
Person 96.4
Person 95.9
Overcoat 82.5
Coat 82.5
Person 82.1
Tripod 78.7
Wood 68.8
Hat 61.2
Portrait 61
Photography 61
Face 61
Photo 61
Shoe 60.6
Footwear 60.6
Suit 55.1
Shoe 53
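
The label/score pairs above have the standard shape of AWS Rekognition's DetectLabels output. A minimal sketch of reproducing them with boto3, assuming configured AWS credentials and a placeholder file name:

```python
import boto3

# A minimal sketch; "photo.jpg" is a placeholder for the actual image file.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

# Each label carries a name and a 0-100 confidence, matching the
# tag/score pairs listed above (e.g. "Person 99.8").
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```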

Clarifai
created on 2023-10-29

people 99.8
group 98.2
man 97.7
adult 97.3
group together 96
administration 95.9
many 93.3
leader 93
veil 91.3
monochrome 90.3
two 90.2
street 88.7
several 88.5
three 86.7
military 86.4
war 85.2
wear 83.9
four 83.9
music 83.3
elderly 82.8
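
Clarifai returns concepts with confidences on a 0-1 scale. A minimal sketch against Clarifai's v2 predict REST endpoint; the API key, image URL, and model name here are placeholders, not necessarily what produced this record:

```python
import requests

# A minimal sketch; key, URL, and model name are placeholder assumptions.
MODEL = "general-image-recognition"
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Concepts come back with a 0-1 value; scaling by 100 gives the scores
# shown above (e.g. "people 99.8").
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```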

Imagga
created on 2022-02-05

man 24.2
old 20.2
male 19.1
engineer 18.3
business 16.4
people 15.1
person 14.5
center 14.3
computer 14
industry 13.7
equipment 13.4
working 13.2
office 13.2
black 13.2
television 13
building 13
room 12.2
power 11.7
shop 11.5
urban 11.4
barbershop 11.3
men 11.2
architecture 10.9
adult 10.4
device 10
city 10
art 9.8
steel 9.7
metal 9.6
safety 9.2
vintage 9.1
history 8.9
work 8.6
construction 8.5
window 8.4
mercantile establishment 8.4
server 8.3
historic 8.2
protection 8.2
industrial 8.2
technology 8.2
interior 8
businessman 7.9
musical instrument 7.7
monitor 7.5
one 7.5
retro 7.4
light 7.3
case 7.3
aged 7.2
information 7.1
job 7.1
machine 7
electronic equipment 7
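
Imagga's tagging endpoint returns the same tag-plus-confidence shape. A minimal sketch, with placeholder credentials and image URL:

```python
import requests

# A minimal sketch of Imagga's /v2/tags endpoint; key, secret, and image
# URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
)

# Tags arrive with an English label and a 0-100 confidence, the same
# tag/score form as the list above (e.g. "man 24.2").
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```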

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

clothing 98.4
person 98.2
man 97.5
text 92.9
standing 90.1
black and white 88.6
outdoor 87.9
black 82.1
concert 59.1
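
The Microsoft tags match the Tags feature of Azure Computer Vision's Analyze Image API. A minimal sketch against the v3.2 REST endpoint, with a placeholder resource host and key:

```python
import requests

# A minimal sketch; the resource host and key are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)

# Confidences are 0-1; scaling by 100 matches the list above
# (e.g. "clothing 98.4").
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```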

Color Analysis

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 57.3%
Sad 28%
Angry 3.9%
Confused 3.7%
Happy 2.6%
Surprised 1.8%
Disgusted 1.6%
Fear 1%

AWS Rekognition

Age 25-35
Gender Male, 77.8%
Calm 99.7%
Sad 0.2%
Confused 0.1%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%
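
The two blocks above are per-face results from AWS Rekognition's DetectFaces. A minimal boto3 sketch, with a placeholder file name; Attributes=["ALL"] requests the age-range, gender, and emotion fields shown:

```python
import boto3

# A minimal sketch; "photo.jpg" is a placeholder.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetail per detected face, hence the two blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```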

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
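
The five blocks above are per-face results from Google Cloud Vision face detection, which reports each attribute as a likelihood bucket rather than a percentage. A minimal sketch with the google-cloud-vision client and a placeholder file name:

```python
from google.cloud import vision  # assumes google-cloud-vision 2.x

# A minimal sketch; "photo.jpg" is a placeholder. Likelihoods come back as
# enum buckets, hence "Very unlikely" / "Possible" above.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# One FaceAnnotation per detected face, hence the five blocks above.
for face in client.face_detection(image=image).face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```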

Feature analysis

Amazon

Person
Shoe
Person 99.8%
Person 98.7%
Person 98%
Person 96.4%
Person 95.9%
Person 82.1%
Shoe 60.6%
Shoe 53%
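
The Person and Shoe entries above are per-object detections. In Rekognition these plausibly come from the Instances list (bounding box plus confidence) that DetectLabels attaches to localizable labels; a sketch, again with a placeholder file name:

```python
import boto3

# A sketch of per-instance detections from DetectLabels; "photo.jpg" is a
# placeholder.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels such as Person and Shoe carry one Instances entry per detected
# object, which is what the repeated percentages above reflect.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}%")
```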

Categories

Text analysis

Amazon

BRARY
SALE
FOR
WOMRATH'S
BOOK
asmetics
FH'S
107

Google

WOMRATHS BRARY TH'S BOOK FOR SALE
WOMRATHS
BRARY
TH'S
BOOK
FOR
SALE
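
Both OCR lists above keep the raw machine output, including partial signage tokens ("BRARY", "TH'S"). A minimal sketch of both passes, with a placeholder file name: Rekognition's DetectText returns LINE and WORD fragments, while Google Vision's text_detection returns one full-text annotation followed by per-word annotations.

```python
import boto3
from google.cloud import vision

# A minimal sketch of both OCR passes; "photo.jpg" is a placeholder.
with open("photo.jpg", "rb") as f:
    content = f.read()

# Amazon: LINE and WORD detections, printed exactly as detected.
rek = boto3.client("rekognition")
for det in rek.detect_text(Image={"Bytes": content})["TextDetections"]:
    print(det["Type"], det["DetectedText"])

# Google: first annotation is the full text, the rest are individual words,
# matching the shape of the Google list above.
gcv = vision.ImageAnnotatorClient()
for annotation in gcv.text_detection(image=vision.Image(content=content)).text_annotations:
    print(annotation.description)
```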