Human Generated Data

Title

[Figures seated outside]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1011.152

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Grass 100
Plant 100
Architecture 100
Building 100
Outdoors 100
Shelter 100
Dining Table 100
Furniture 100
Table 100
Indoors 99.5
Restaurant 99.5
Backyard 99.3
Nature 99.3
Yard 99.3
Lawn 99.2
Person 98.9
House 98.8
Housing 98.8
Patio 98.8
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Face 98.6
Head 98.6
Photography 98.6
Portrait 98.6
Person 98.4
Child 98.4
Female 98.4
Girl 98.4
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 98.1
Adult 98.1
Male 98.1
Man 98.1
Person 96.8
Chair 95.8
People 94.4
Person 90.6
Chair 90.3
Vegetation 88.8
Summer 75.8
Person 75.4
Porch 74.6
Canopy 71.9
Dining Room 67
Room 67
Cafeteria 57.3
Tree 57.3
Cafe 56.4
Fun 56.4
Leisure Activities 56.4
Picnic 56.4
Food 56.2
Food Court 56.2
Patio Umbrella 56
Park 55.9
Sitting 55.7
Clothing 55.5
Undershirt 55.5
Shorts 55.5
Deck 55.2
Footwear 55.2
Shoe 55.2
Grove 55.2
Land 55.2
Woodland 55.2

Clarifai
created on 2023-10-14

people 99.7
man 98.6
chair 98.6
woman 98.2
group 98.2
recreation 97.7
restaurant 97.6
adult 97.5
group together 95
furniture 93.9
table 93.4
umbrella 91.7
picnic 91.3
bar 91.3
many 90.4
several 90.1
bistro 89.5
coffee 88.2
seat 88.1
dining 87.4

Imagga
created on 2019-01-31

chair 38.5
table 29.8
restaurant 23.6
park 23.4
chairs 21.5
man 21.5
seat 18.2
sitting 17.2
outdoors 16.7
people 16.7
building 15.8
tables 14.8
tract 14.6
outdoor 14.5
spectator 14.4
person 13.4
patio 13.2
male 12.8
interior 12.4
lifestyle 12.3
vacation 12.3
summer 12.2
resort 11.8
hotel 11.5
dining 11.4
adult 11.3
dinner 11.2
outside 11.1
center 11
beach 11
couple 10.4
business 10.3
luxury 10.3
area 10.2
room 10.2
relax 10.1
umbrella 10.1
lunch 9.9
group 9.7
sun 9.7
together 9.6
sky 9.6
cafeteria 9.5
sea 9.4
structure 9.3
glass 9.3
place 9.3
hall 9.3
furniture 9.2
meal 9.1
relaxing 9.1
work 8.7
women 8.7
barroom 8.5
modern 8.4
eat 8.4
relaxation 8.4
coffee 8.3
bar 8.3
leisure 8.3
computer 8.2
happy 8.1
laptop 8
deck 7.9
life 7.8
scene 7.8
travel 7.7
casual 7.6
vacations 7.5
house 7.5
drink 7.5
friends 7.5
city 7.5
fun 7.5
ocean 7.5
stall 7.4
tourism 7.4
student 7.4
wine 7.4
water 7.3
island 7.3
smiling 7.2
home 7.2
smile 7.1
office 7.1
working 7.1
day 7.1
architecture 7
indoors 7

Google
created on 2019-01-31

Leisure 63.9
Event 62.7
Sitting 62.5
Table 58.6

Microsoft
created on 2019-01-31

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 86%
Calm 66.5%
Sad 47.2%
Surprised 6.5%
Fear 6.2%
Angry 4.5%
Disgusted 0.8%
Confused 0.6%
Happy 0.5%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 94.7%
Surprised 6.3%
Fear 5.9%
Sad 3.1%
Confused 1.6%
Angry 0.4%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 34-42
Gender Male, 100%
Calm 58%
Happy 11.5%
Confused 10.7%
Surprised 8.7%
Fear 6.9%
Angry 6%
Sad 4.3%
Disgusted 2.1%

AWS Rekognition

Age 38-46
Gender Male, 100%
Sad 97.5%
Calm 17.2%
Confused 8%
Surprised 7.8%
Disgusted 7.3%
Fear 7.2%
Happy 7%
Angry 2.2%

AWS Rekognition

Age 45-53
Gender Male, 97.2%
Happy 77.4%
Calm 20.9%
Surprised 6.6%
Fear 5.9%
Sad 2.3%
Disgusted 0.2%
Confused 0.2%
Angry 0.2%

AWS Rekognition

Age 45-51
Gender Male, 88.7%
Calm 96.3%
Surprised 6.3%
Fear 6%
Sad 3.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Happy 0.1%

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Adult 98.7%
Male 98.7%
Man 98.7%
Child 98.4%
Female 98.4%
Girl 98.4%
Chair 95.8%
Shoe 55.2%

Categories