Human Generated Data

Title

[Store window with reflection]

Date

1950s?

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.329.6

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.5
Human 99.5
Vehicle 84.5
Transportation 84.5
Automobile 84.5
Car 84.5
Face 79.5
Person 75.7
Person 71.6
Aircraft 68.9
Airplane 68.9
Car 65.4
Photography 62.2
Portrait 62.2
Photo 62.2
Apparel 55.5
Clothing 55.5

Clarifai
created on 2019-05-29

people 99.9
adult 98.8
group together 98.7
group 96.8
monochrome 95.5
wear 93.9
two 93.8
vehicle 93.7
man 93.4
three 93.1
one 92.6
administration 90.5
woman 90.4
transportation system 89.9
several 89.4
many 88.1
watercraft 87
military 86.8
four 86.5
outfit 85.4

Imagga
created on 2019-05-29

bartender 84.6
building 19.3
sky 17.2
man 15.5
industry 14.5
city 14.1
person 13.9
people 13.9
male 13.8
industrial 13.6
architecture 12.5
work 11.8
smoke 11.2
water 10.7
seller 10.3
business 10.3
men 10.3
island 10.1
adult 9.7
outdoors 9.7
landscape 9.7
construction 9.4
equipment 9.4
sea 9.4
winter 9.4
worker 9.2
urban 8.7
snow 8.7
culture 8.5
buildings 8.5
travel 8.4
power 8.4
old 8.4
ocean 8.3
exterior 8.3
sun 8
holiday 7.9
black 7.8
boat 7.7
park 7.6
house 7.5
one 7.5
tourism 7.4
environment 7.4
history 7.1
smile 7.1
steel 7.1
working 7.1
day 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 94.9
man 93.2
black and white 91.4
clothing 81.9
human face 72.5
black 67.8
old 62.2
white 62.1
posing 39.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 93.4%
Angry 3.7%
Calm 10.6%
Confused 4.2%
Disgusted 9.6%
Happy 5.1%
Sad 55.5%
Surprised 11.3%

Feature analysis

Amazon

Person 99.5%
Car 84.5%
Airplane 68.9%

Categories

Text analysis

Amazon

OLD
OLD COLD
COLD