Human Generated Data

Title

Untitled (people on the streets, NYC)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15895.3

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.3
Human 99.3
Person 99
Person 98.8
Clothing 98.6
Apparel 98.6
Shoe 91
Footwear 91
Door 74.6
Coat 70.5
Clinic 66.2
Shorts 61.8
Indoors 57.7
Sleeve 57.4
Flooring 57
Room 56.7
Shoe 51.9
Shoe 51.1

Clarifai
created on 2023-10-29

people 99.9
monochrome 98.9
wear 97.4
adult 96.5
man 95.5
uniform 94.8
gown (clothing) 93.7
group 92.7
street 91.6
two 90.1
outfit 89.8
outerwear 89.3
medical practitioner 88.1
woman 86.1
group together 84.5
many 83.9
war 83
child 82.8
one 82.2
coat 82.1

Imagga
created on 2022-02-05

barbershop 34.9
shop 32
people 26.2
man 24.2
mercantile establishment 23.3
adult 18.5
person 17.9
male 17.1
place of business 15.5
clothing 15.3
dress 14.5
portrait 14.2
family 14.2
women 13.4
bride 13.4
happiness 13.3
happy 13.2
men 12.9
smile 12.8
newspaper 12.2
couple 12.2
old 11.1
wedding 11
health 10.4
life 10.4
home 10.4
celebration 10.4
love 10.3
nurse 10.2
city 10
hairdresser 9.8
human 9.7
groom 9.4
work 9.4
smiling 9.4
religion 9
business 8.5
senior 8.4
black 8.4
attractive 8.4
hand 8.4
room 8.3
church 8.3
tradition 8.3
fashion 8.3
looking 8
businessman 7.9
medical 7.9
indoors 7.9
product 7.9
together 7.9
face 7.8
patient 7.8
ceremony 7.8
daughter 7.7
establishment 7.7
two 7.6
elegance 7.6
professional 7.5
worker 7.5
religious 7.5
traditional 7.5
future 7.4
mature 7.4
care 7.4
historic 7.3
chair 7.3
clinic 7.2
suit 7.2
romantic 7.1
interior 7.1
travel 7
hospital 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.5
outdoor 95.9
wedding dress 95.4
clothing 93.6
dress 91.3
bride 88.8
person 86.6
black and white 85.1
street 83.6
woman 75.7
dance 60.4
footwear 59.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Female, 97.2%
Calm 97.8%
Sad 2%
Confused 0.1%
Angry 0%
Disgusted 0%
Happy 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 94.2%
Calm 94.3%
Sad 5.2%
Angry 0.2%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%
Surprised 0%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 95.3%
Angry 2.6%
Confused 0.7%
Surprised 0.5%
Sad 0.4%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Male, 90.9%
Calm 54.8%
Sad 27.7%
Happy 8.5%
Disgusted 3.3%
Fear 2.2%
Angry 1.6%
Surprised 1.4%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.3%
Person 99%
Person 98.8%
Shoe 91%
Shoe 51.9%
Shoe 51.1%

Categories

Text analysis

Amazon

WEAR
REPAIRING
ORIS WEAR
RAYONS
CLEANING
ORIS
REMNAN
WOOLENS-COTTO
RAYONS YARD GO
YARD
GO

Google

REMNAN MOOLENS-COTTO RAYONS-YARD GO REPAIRING NIS WEAR CLEANING
REMNAN
MOOLENS-COTTO
RAYONS-YARD
GO
REPAIRING
NIS
WEAR
CLEANING