Human Generated Data

Title

Untitled (four people wearing formal clothes emerging from building covered by foliage)

Date

1935-1945

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9030

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.5
Person 99.5
Dress 98.1
Apparel 98.1
Clothing 98.1
Person 96.8
Street 86.4
Building 86.4
Town 86.4
Road 86.4
Urban 86.4
City 86.4
Person 85.7
Female 81.3
Overcoat 79.1
Suit 79.1
Coat 79.1
Path 74.9
Door 74.9
People 70.5
Person 68.8
Walkway 68.6
Costume 66.4
Woman 64.7
Robe 63.3
Fashion 63.3
Text 62
Photo 61.4
Photography 61.4
Gown 57.8
Indoors 57.6
Alleyway 57.5
Alley 57.5
Face 56.7

Imagga
created on 2022-02-05

pay-phone 100
telephone 89.1
electronic equipment 67.8
equipment 38.8
people 24
call 23.7
person 21.8
fashion 21.1
portrait 20.7
attractive 18.2
man 17.5
lady 17
adult 16.2
sexy 16.1
street 15.6
dress 15.4
posing 15.1
wall 14.5
black 14.5
style 14.1
sensuality 13.6
male 13.5
human 13.5
hair 13.5
pretty 12.6
legs 12.3
cute 12.2
business 12.1
suit 11.9
city 11.6
lifestyle 11.6
standing 11.3
building 11.2
model 10.9
urban 10.5
body 10.4
women 10.3
happy 10
outdoor 9.9
corporate 9.5
men 9.4
elegant 9.4
dark 9.2
old 9.1
lovely 8.9
couple 8.7
high 8.7
face 8.5
elegance 8.4
outdoors 8.2
office 7.9
work 7.8
summer 7.7
expression 7.7
youth 7.7
fashionable 7.6
makeup 7.3
umbrella 7.3
alone 7.3
success 7.2
smile 7.1
businessman 7.1
travel 7
growth 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

clothing 92.3
outdoor 89.4
person 83.9
footwear 83.2
black and white 76.3
woman 74.9
text 73.3

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 93.6%
Calm 79%
Happy 6%
Surprised 5.6%
Sad 4.2%
Confused 1.4%
Disgusted 1.4%
Fear 1.3%
Angry 1.1%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people standing in front of a window 82.3%
a person standing in front of a window 82.2%
a group of people walking in front of a window 82.1%

Text analysis

Amazon

a
STARTIN
MJIR
MJIR STARTIN ARDA
ARDA

Google

3TARTIM
MJI3
ARJA
MJI3 3TARTIM ARJA