Human Generated Data

Title

Untitled (Elizabeth Street, New York City)

Date

1932–1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2829

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99
Human 99
Person 98.9
Face 97
Clothing 92.4
Apparel 92.4
Hair 67.8
Undershirt 65.7
Portrait 63.9
Photo 63.9
Photography 63.9
Finger 61
Female 58.5
Man 57.1

Clarifai
created on 2018-03-23

people 99.9
portrait 99.2
adult 99.2
woman 96.9
man 96.9
one 96.4
two 96.3
wear 93.3
administration 91.1
group 90.6
facial expression 90.5
music 87
monochrome 85.4
actor 84.6
actress 82.8
leader 82.2
retro 81.8
three 81.4
war 80.8
veil 77.6

Imagga
created on 2018-03-23

man 22.2
person 21.9
portrait 20.7
currency 20.7
money 20.4
statue 18.7
grandma 17.4
art 17
old 16.7
dollar 16.7
cash 16.5
male 16.3
sculpture 15.9
one 15.7
ancient 15.6
face 14.9
crazy 14.9
people 14
banking 13.8
culture 13.7
adult 13.6
ruler 13.5
religion 13.5
bank 13.5
bill 13.3
god 12.4
senior 12.2
expression 12
stone 11.8
finance 11
architecture 10.9
close 10.9
history 10.7
comedian 10.7
performer 10.6
religious 10.3
hair 10.3
head 10.1
financial 9.8
banknote 9.7
us 9.6
golden 9.5
closeup 9.4
paper 9.4
church 9.3
world 9.2
business 9.1
wealth 9
hundred 8.7
temple 8.7
dollars 8.7
emotion 8.3
vintage 8.3
human 8.3
mask 8.1
body 8
franklin 7.9
entertainer 7.9
banknotes 7.8
black 7.8
antique 7.8
model 7.8
marble 7.8
finances 7.7
payment 7.7
pay 7.7
hand 7.6
historical 7.5
kin 7.5
famous 7.5
mature 7.4
economy 7.4
historic 7.3
mother 7.3
looking 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 99.9

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 93.6%
Happy 39.9%
Confused 14.9%
Angry 5%
Disgusted 4.3%
Sad 8%
Calm 25.4%
Surprised 2.5%

AWS Rekognition

Age 35-52
Gender Female, 99.1%
Sad 40.9%
Disgusted 1.3%
Surprised 1.1%
Calm 39.9%
Angry 3.5%
Happy 10.4%
Confused 3%

Microsoft Cognitive Services

Age 66
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Captions

Azure OpenAI

Created on 2024-01-27

The image displays two individuals seated next to each other. They both appear to be wearing patterned attire; the individual on the right seems to be dressed in a polka dot patterned outfit, while the one on the left is in a lighter-colored garment with a v-neck design. Both are seated against a backdrop with text and images, suggesting they may be in front of a poster or a painted wall. The photograph has an aged, monochromatic look, indicative of being taken in an earlier era. The environment looks worn and weathered, contributing to a vintage feel.

Anthropic Claude

Created on 2024-03-29

The image depicts two women standing in front of a building. They appear to be wearing traditional dresses or outfits, with one woman wearing a patterned polka dot dress and the other in a plainer shirt. Both women have stern expressions on their faces as they gaze directly at the camera. The photograph has a grainy, black and white aesthetic, suggesting it was taken some time ago. The setting appears to be a rural or small-town environment, with the wooden building behind the women providing a simple backdrop.

Text analysis

Amazon

TH
IR