Human Generated Data

Title

[Group at look-out point]

Date

1950s?

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.331.6

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.3
Human 99.3
Person 97.3
Person 96.9
Person 95.7
Person 95.2
Nature 94.6
Person 91.7
Outdoors 84
Person 79.5
Tree 76.1
Plant 76.1
Mammal 74.9
Horse 74.9
Animal 74.9
Person 69.1
People 63.5
Bus Stop 59
Person 51.6

Clarifai
created on 2019-05-29

people 99.8
group together 99
adult 97.5
vehicle 96.9
man 96.3
group 96.1
street 95.9
military 93.9
road 93.8
transportation system 93.4
war 91.7
soldier 89.9
many 89.2
administration 85
monochrome 84.6
two 83.5
calamity 82
wear 81.6
woman 80.4
fog 79.3

Imagga
created on 2019-05-29

motor vehicle 100
snowplow 100
wheeled vehicle 53.2
vehicle 40.8
snow 32.6
winter 28.9
landscape 27.5
road 27.1
cold 23.2
sky 21.7
travel 17.6
industry 17.1
machine 16.3
car 15.6
transportation 15.2
trees 15.1
rural 15
tree 14.8
outdoors 14.2
construction 13.7
industrial 13.6
snowy 13.6
tractor 13.2
scenic 13.2
scene 13
ice 12.9
ship 12.2
truck 11.9
work 11.8
machinery 11.7
frost 11.5
country 11.4
old 11.1
mountains 11.1
cannon 11
scenery 10.8
river 10.7
mountain 10.7
pine 10.5
season 10.1
transport 10
weather 9.9
cloud 9.5
high-angle gun 9.5
water 9.3
power 9.2
outdoor 9.2
city 9.1
freeze 8.7
rock 8.7
frozen 8.6
heavy 8.6
outside 8.6
drive 8.5
house 8.4
street 8.3
park 8.2
equipment 8.2
tourist 8.2
working 8
building 7.9
grass 7.9
sand 7.9
forest 7.8
icy 7.8
black 7.8
driving 7.7
dirt 7.6
ground 7.6
wheel 7.5
horizontal 7.5
field 7.5
wood 7.5
cloudy 7.5
tourism 7.4
town 7.4
light 7.4
artillery 7.3
countryside 7.3
business 7.3
farm 7.1
day 7.1
architecture 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

fog 94.8
tree 92.3
outdoor 89.1
snow 82.1
black and white 74.5

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Sad 49.7%
Angry 49.7%
Disgusted 49.8%
Calm 49.6%
Confused 49.6%
Surprised 49.6%
Happy 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Angry 49.6%
Calm 50%
Surprised 49.6%
Happy 49.5%
Sad 49.7%
Confused 49.5%
Disgusted 49.6%

AWS Rekognition

Age 27-44
Gender Female, 50.5%
Surprised 49.5%
Disgusted 49.5%
Angry 49.5%
Sad 49.5%
Happy 49.5%
Calm 50.3%
Confused 49.5%

Feature analysis

Amazon

Person 99.3%
Horse 74.9%

Captions

Microsoft

a group of people walking down the street 71.1%
a group of people walking down a street 70%
a group of people walking down a road 69.9%