Human Generated Data

Title

Untitled (Elizabeth Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2892

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.5
Bride 99.5
Female 99.5
Person 99.5
Wedding 99.5
Woman 99.5
Adult 99.2
Person 99.2
Male 99.2
Man 99.2
Face 98.9
Head 98.9
Photography 98.9
Portrait 98.9
Lady 90.6
Clothing 89.9
Dress 89.9
Furniture 76
Formal Wear 73.4
Toy 68.9
Fireplace 57.6
Indoors 57.6
Architecture 57.5
Building 57.5
Hospital 57.5
Door 57.1
Blouse 57
Fashion 56.2
Gown 56.2
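
The Amazon tags above are label-detection confidences on a 0-100 scale. Below is a minimal sketch of how such labels could be requested from the AWS Rekognition DetectLabels API using boto3; the file name "elizabeth_street.jpg" and the MinConfidence threshold are illustrative assumptions, not details of the museum's actual pipeline.

    # Hypothetical sketch: image labels via AWS Rekognition DetectLabels (boto3).
    # The file name and MinConfidence value are assumptions for illustration.
    import boto3

    client = boto3.client("rekognition")

    with open("elizabeth_street.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the lowest score listed above is roughly 56
        )

    # Print each label with its confidence, mirroring the "Tag score" lines above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")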

Clarifai
created on 2018-05-10

people 100
adult 99.6
portrait 98.8
two 98.7
actress 97.7
leader 96.8
administration 96.7
man 96.4
woman 96.1
sit 95
furniture 94.9
facial expression 94.6
one 94.2
actor 92.3
group 91.9
wear 90.5
seat 89.9
music 89.6
three 89.4
easy chair 88

Imagga
created on 2023-10-06

man 37
person 35.4
male 32.8
bow tie 29.5
people 26.2
necktie 24.4
adult 20.3
professional 16.6
men 16.3
waiter 16
work 15.7
worker 15.5
portrait 15.5
clothing 15.3
garment 14.6
happy 13.8
patient 13.6
business 13.4
businessman 13.2
couple 13.1
grandfather 13
dining-room attendant 12.8
coat 12.5
medical 12.4
doctor 12.2
world 12.1
barbershop 11.8
smile 11.4
face 11.4
fashion 11.3
employee 11.2
old 11.2
smiling 10.9
black 10.8
one 10.5
looking 10.4
love 10.3
occupation 10.1
confident 10
handsome 9.8
groom 9.8
office 9.7
hospital 9.7
serious 9.5
building 9.5
sitting 9.4
happiness 9.4
senior 9.4
health 9
shop 9
indoors 8.8
profession 8.6
room 8.6
suit 8.4
attractive 8.4
father 8.4
scholar 8.3
human 8.2
uniform 8.2
mother 8.2
dress 8.1
lab coat 8
family 8
job 8
hair 7.9
hands 7.8
education 7.8
bride 7.7
child 7.6
study 7.5
holding 7.4
care 7.4
home 7.2
mercantile establishment 7.1
working 7.1
parent 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.6
old 89.5
posing 67.6
older 20

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 99.8%
Angry 43.7%
Calm 35.6%
Confused 15.3%
Surprised 6.6%
Fear 6.2%
Sad 3.3%
Disgusted 0.5%
Happy 0.3%

AWS Rekognition

Age 59-67
Gender Male, 88.2%
Happy 61.1%
Calm 35.8%
Surprised 6.9%
Fear 6.1%
Sad 2.3%
Disgusted 0.5%
Angry 0.3%
Confused 0.3%
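
The two AWS Rekognition blocks above report, for each detected face, an estimated age range, a gender guess with confidence, and per-emotion confidences. A minimal sketch, assuming boto3 and the same illustrative file name as before, of how such face attributes could be obtained from the DetectFaces API:

    # Hypothetical sketch: face attributes via AWS Rekognition DetectFaces (boto3).
    import boto3

    client = boto3.client("rekognition")

    with open("elizabeth_street.jpg", "rb") as f:  # assumed file name
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unordered; sort by confidence to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")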

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 71
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
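
Unlike Rekognition, Google Vision reports face expressions as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch, assuming a recent google-cloud-vision Python client and the same illustrative file name, of how those likelihoods could be read:

    # Hypothetical sketch: face likelihoods via the Google Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("elizabeth_street.jpg", "rb") as f:  # assumed file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (e.g. VERY_UNLIKELY, POSSIBLE, LIKELY).
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)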

Feature analysis

Amazon

Adult 99.5%
Bride 99.5%
Female 99.5%
Person 99.5%
Woman 99.5%
Male 99.2%
Man 99.2%