Human Generated Data

Title

[Group of figures standing by road signs, near Falls Village, Connecticut]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1004.128

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-23

Clothing 100
Adult 99.1
Female 99.1
Person 99.1
Woman 99.1
Adult 99.1
Person 99.1
Male 99.1
Man 99.1
Adult 99.1
Female 99.1
Person 99.1
Woman 99.1
Person 98.7
Person 98
Overcoat 87.5
Outdoors 84.2
Croquet 78.1
Sport 78.1
Face 77.2
Head 77.2
Footwear 74.4
Shoe 74.4
Coat 71.9
Nature 68.2
Shoe 68
Grass 62
Plant 62
Dress 61.5
Accessories 57.3
Bag 57.3
Handbag 57.3
People 56.5
Garden 56.2
Bus Stop 56
Walking 55.9
Gardening 55.7
Hat 55.4
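
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced, assuming boto3 is installed and configured with AWS credentials and that the digitized print is stored in a hypothetical local file feininger_road_signs.jpg (this is illustrative, not the museum's actual pipeline):

    import boto3

    # Rekognition client; region and credentials come from the standard AWS config.
    client = boto3.client("rekognition")

    # Read the scanned photograph as raw bytes (filename is a placeholder).
    with open("feininger_road_signs.jpg", "rb") as f:
        image_bytes = f.read()

    # Request labels above a confidence floor, roughly matching the cutoff seen above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,
    )

    # Print "Label confidence" pairs, e.g. "Clothing 100.0".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')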

Clarifai
created on 2023-10-15

people 99.9
monochrome 99.9
child 99.7
street 97.8
family 97.7
woman 97.2
adult 96.9
group together 95.5
boy 95.4
group 95.1
man 92.6
documentary 92.2
wear 91.1
black and white 90.4
two 88
portrait 87.2
three 87.1
son 87
offspring 87
administration 86

Imagga
created on 2019-02-03

world 27.5
silhouette 24.8
man 24.2
athlete 21.9
person 21.3
structure 21.1
billboard 21
sunset 19.8
people 19
beach 18.6
sky 17.9
signboard 16.3
water 16
male 15.6
outdoor 15.3
summer 14.8
sun 14.5
player 14.4
black 14.1
landscape 13.4
runner 12.9
peaceful 12.8
contestant 12.2
fountain 12.1
outdoors 11.9
sport 11.6
life 11.6
park 11.5
adult 11.1
sea 10.9
dark 10.9
ocean 10.8
light 10.7
vacation 10.6
serene 10.4
men 10.3
evening 10.3
leisure 10
travel 9.9
clouds 9.3
field 9.2
lake 9.1
recreation 9
art 8.5
tree 8.5
portrait 8.4
vintage 8.3
human 8.2
retro 8.2
horizon 8.1
trees 8
scenic 7.9
couple 7.8
forest 7.8
ballplayer 7.7
happy 7.5
free 7.5
fun 7.5
tourism 7.4
lights 7.4
environment 7.4
alone 7.3
dirty 7.2
coast 7.2
sunlight 7.1
grass 7.1
night 7.1
love 7.1
happiness 7.1

Google
created on 2019-02-03

Microsoft
created on 2019-02-03

outdoor 96.2
standing 93.2
person 92.4
posing 59.8
old 52.1
black and white 37.8
boy 12.1

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.7%
Happy 58.8%
Calm 38.6%
Surprised 6.3%
Fear 6.1%
Sad 2.3%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 25-35
Gender Female, 80.4%
Calm 77.8%
Fear 8.1%
Surprised 6.9%
Sad 6.5%
Disgusted 3.5%
Confused 2.5%
Angry 1.1%
Happy 0.7%

AWS Rekognition

Age 25-35
Gender Female, 94%
Calm 57.4%
Happy 21.5%
Sad 8.9%
Surprised 6.8%
Fear 6.7%
Angry 2.9%
Confused 2.6%
Disgusted 1.9%
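
The age range, gender, and emotion scores in the AWS Rekognition blocks above correspond to the DetectFaces operation with full attributes requested. A hedged sketch under the same assumptions as the label example (placeholder filename, standard boto3 credentials):

    import boto3

    client = boto3.client("rekognition")

    with open("feininger_road_signs.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotion estimates to each face.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back with confidences; sort to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')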

Microsoft Cognitive Services

Age 41
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
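
The Google Vision likelihood ratings above (Joy, Sorrow, Anger, Surprise, Headwear, Blurred) are the enum values returned by Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library, application-default credentials, and the same placeholder filename:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("feininger_road_signs.jpg", "rb") as f:
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # Each detected face carries likelihood enums such as VERY_UNLIKELY or LIKELY.
    for face in response.face_annotations:
        print("Joy", face.joy_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Surprise", face.surprise_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)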

Feature analysis

Amazon

Adult 99.1%
Female 99.1%
Person 99.1%
Woman 99.1%
Male 99.1%
Man 99.1%
Shoe 74.4%
Coat 71.9%

Categories

Text analysis

Amazon

TORRINGTON
WES
WALL
HEN
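
The strings above (fragments of the road signs, e.g. TORRINGTON) are word-level detections of the kind returned by Rekognition's DetectText operation. A short sketch under the same assumptions as the earlier Rekognition examples:

    import boto3

    client = boto3.client("rekognition")

    with open("feininger_road_signs.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # Keep only word-level detections, which is what the list above resembles.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])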

Google

TORRINGTON
WALL
TORRINGTON WALL
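
The Google results above mix individual words with a combined line, which matches the shape of Cloud Vision text detection output. A minimal sketch under the same assumptions as the face-detection example:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("feininger_road_signs.jpg", "rb") as f:
        content = f.read()

    response = client.text_detection(image=vision.Image(content=content))

    # The first annotation is the full detected text; the rest are individual words.
    for annotation in response.text_annotations:
        print(annotation.description)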