Human Generated Data

Title

[Outdoor Party on large lawn]

Date

1930s-1940s?

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.505.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Nature 99.9
Outdoors 99.7
Human 99.1
Person 99.1
Person 97.6
Person 97.4
Person 96.9
Person 94.8
Animal 93.9
Bird 93.9
Countryside 93.3
Rural 88.5
Shelter 88.5
Building 88.5
Bird 87.2
Person 87
Snow 86.6
Hut 85.6
Shack 85.6
Storm 80.2
Person 79.2
Winter 77.7
Person 72.6
Bird 65.8
Housing 64.3
House 63.4
Person 60.2
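Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale. As a minimal sketch (the list layout and the 90-point cutoff are illustrative assumptions, not part of the record), such tags can be filtered down to high-confidence labels:

```python
# Minimal sketch: filter machine-generated tags by confidence.
# The (label, score) pairs are copied from the Amazon list above;
# the 90-point threshold is an arbitrary illustrative choice.
labels = [
    ("Nature", 99.9), ("Outdoors", 99.7), ("Human", 99.1),
    ("Person", 99.1), ("Animal", 93.9), ("Bird", 93.9),
    ("Countryside", 93.3), ("Rural", 88.5), ("Shelter", 88.5),
    ("Snow", 86.6), ("Storm", 80.2), ("Winter", 77.7),
]

def high_confidence(tags, threshold=90.0):
    """Keep only tags whose confidence meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(high_confidence(labels))
# -> ['Nature', 'Outdoors', 'Human', 'Person', 'Animal', 'Bird', 'Countryside']
```

Raising or lowering the threshold trades recall for precision; note how weather-related tags ("Snow", "Storm", "Winter") sit well below the strongest labels here.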

Clarifai
created on 2019-11-19

people 99.9
adult 98.7
home 98.5
group 98.1
man 97.4
group together 96.2
many 92.6
two 91.7
calamity 91.2
several 90.9
child 90.6
woman 88.9
military 88
administration 87.7
war 86.2
one 86
four 85.9
interaction 83.8
street 80.7
soldier 80.7

Imagga
created on 2019-11-19

barbershop 79.2
shop 60.9
mercantile establishment 48.3
place of business 32.2
sand 18.2
old 17.4
beach 16.9
establishment 16.1
travel 14.8
dirty 13.5
sky 13.4
water 13.3
vacation 13.1
vintage 12.4
man 12.1
kin 11.9
architecture 11.8
child 11.2
sea 10.9
ocean 10.8
coast 10.8
holiday 10.7
history 10.7
building 10.6
scene 10.4
newspaper 10.3
house 10
city 10
road 9.9
summer 9.6
grunge 9.4
industrial 9.1
wall 8.7
male 8.6
winter 8.5
black 8.4
people 8.4
texture 8.3
outdoors 8.2
aged 8.1
light 8
person 7.9
product 7.9
art 7.8
ancient 7.8
culture 7.7
tree 7.7
room 7.6
retro 7.4
daily 7.4
street 7.4
decoration 7.3
girls 7.3
color 7.2
lifestyle 7.2
snow 7.2
sunlight 7.1
trees 7.1
family 7.1
brick 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

outdoor 99.5
text 95.9
house 95
person 78.1
clothing 54.3
old 51.7

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Male, 50.4%
Calm 49.5%
Happy 50.1%
Surprised 49.5%
Fear 49.6%
Disgusted 49.5%
Sad 49.7%
Confused 49.5%
Angry 49.6%

AWS Rekognition

Age 21-33
Gender Female, 50.3%
Sad 49.9%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 49.5%
Fear 49.6%
Happy 49.9%
Angry 49.6%

AWS Rekognition

Age 38-56
Gender Female, 51.2%
Happy 45.5%
Calm 46.5%
Surprised 45.1%
Fear 45.3%
Confused 45.5%
Sad 50.4%
Angry 46.2%
Disgusted 45.5%

AWS Rekognition

Age 20-32
Gender Female, 50.3%
Disgusted 49.5%
Fear 49.5%
Angry 50.2%
Confused 49.5%
Surprised 49.5%
Sad 49.5%
Happy 49.6%
Calm 49.7%

AWS Rekognition

Age 26-40
Gender Female, 50.1%
Happy 49.6%
Surprised 49.5%
Calm 49.9%
Confused 49.6%
Fear 49.5%
Disgusted 49.5%
Sad 49.7%
Angry 49.6%
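Each face record above lists eight emotion scores; the dominant emotion for a face is simply the highest-scoring entry. As a minimal sketch (the dict layout is an assumption, not the Rekognition response format), with scores copied from the third face above:

```python
# Minimal sketch: pick the dominant emotion from a face record.
# Scores are copied from the third AWS Rekognition face above (Age 38-56).
face = {
    "Happy": 45.5, "Calm": 46.5, "Surprised": 45.1, "Fear": 45.3,
    "Confused": 45.5, "Sad": 50.4, "Angry": 46.2, "Disgusted": 45.5,
}

# max over keys, ordered by their score values
dominant = max(face, key=face.get)
print(dominant)  # -> Sad
```

Note that most faces here have nearly uniform scores (clustered around 49.5%), so the "dominant" emotion is only marginally ahead of the others and should be read with caution.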

Feature analysis

Amazon

Person 99.1%
Bird 93.9%

Captions

Microsoft

an old photo of a person 86.6%
a black and white photo of a person 78.4%
a group of people standing in front of a building 78.3%

Text analysis

Google

rrr
rrr