Human Generated Data

Title

[Figures on ship deck, looking over railing]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1007.83

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Clothing 100
Photography 100
Water 99.9
Waterfront 99.9
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99
Male 99
Man 99
Person 99
Adult 99
Male 99
Man 99
Person 99
Person 98.7
Person 98.5
Person 98.4
Person 97
Coat 96.5
Outdoors 92.7
Nature 90.4
Sea 90.4
Jacket 83.4
Face 83.4
Head 83.4
Formal Wear 78.4
Suit 78.4
Portrait 68.8
Pier 65.1
Hat 61.5
Sky 57.6
Transportation 57.5
Vehicle 57.5
Yacht 57.5
Architecture 57.5
Building 57.5
Deck 57.5
House 57.5
Housing 57.5
Porch 57.5
Boardwalk 56.8
Bridge 56.8
Baseball Cap 56.7
Cap 56.7
Blazer 56.6
Railing 56.5
Overcoat 55.7
Vest 55.6
Pants 55.3
Boat 55.2
Ferry 55.2
Ship 55.2
Shoreline 55.2
Path 55
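
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such tags could be retrieved with boto3 follows; the local file name and the confidence threshold are illustrative assumptions, not part of this record.

```python
# Minimal sketch: fetch label/confidence pairs like the Amazon tags above
# using AWS Rekognition's DetectLabels API via boto3. The file name and
# MinConfidence threshold are illustrative assumptions only.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("feininger_ship_deck.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly matches the lowest scores listed above
)

for label in response["Labels"]:
    # Prints lines such as "Clothing 100.0" or "Person 99.3"
    print(f"{label['Name']} {label['Confidence']:.1f}")
```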

Clarifai
created on 2023-10-15

people 99.6
group 98.7
man 95.9
adult 95.2
child 93
woman 92.3
group together 90.3
boy 88.2
family 87.1
street 85.3
wear 85.2
offspring 83.9
transportation system 83.3
three 82.3
portrait 81.1
window 80.2
vehicle 78.8
two 78.7
recreation 78.2
four 77.6

Imagga
created on 2019-01-31

people 29
man 27.5
male 25.6
person 24
world 24
silhouette 22.3
business 19.4
room 19.4
men 18
musical instrument 17.2
adult 16.9
classroom 16.9
office 16.3
group 16.1
black 15.7
businessman 15
life 13.4
indoor 12.8
marimba 12.6
percussion instrument 12.4
couple 12.2
love 11.8
sitting 11.2
women 11.1
suit 11
team 10.7
night 10.6
corporate 10.3
work 10.2
computer 10.2
teacher 9.9
working 9.7
looking 9.6
child 9.5
education 9.5
meeting 9.4
relax 9.3
dark 9.2
window 9.2
alone 9.1
board 9
job 8.8
indoors 8.8
professional 8.7
lifestyle 8.7
table 8.7
laptop 8.6
adults 8.5
modern 8.4
communication 8.4
spectator 8.3
technology 8.2
chair 8.2
sunset 8.1
light 8
building 8
home 8
together 7.9
youth 7.7
clothing 7.6
wind instrument 7.6
career 7.6
relaxation 7.5
leisure 7.5
teamwork 7.4
back 7.3
book 7.3
screen 7.3
music 7.2
interior 7.1
day 7.1

Google
created on 2019-01-31

Microsoft
created on 2019-01-31

person 99.4
window 97.6
black and white 44
street 21.2
monochrome 16.2
people 13.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 69.6%
Calm 57.5%
Happy 17.8%
Sad 11.5%
Surprised 10.1%
Fear 6.2%
Confused 2.9%
Angry 1.2%
Disgusted 1.2%

AWS Rekognition

Age 33-41
Gender Male, 98.4%
Calm 99%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%
Confused 0%

AWS Rekognition

Age 6-16
Gender Male, 76.5%
Calm 47.3%
Disgusted 12.1%
Fear 9%
Surprised 8.7%
Sad 8.5%
Happy 8.3%
Confused 7.9%
Angry 3.3%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Calm 85.1%
Confused 7.3%
Surprised 6.4%
Fear 6%
Sad 4.6%
Angry 0.8%
Happy 0.5%
Disgusted 0.3%
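
Each "AWS Rekognition" block above corresponds to one detected face, reporting an estimated age range, a gender estimate, and per-emotion confidences. A minimal sketch of how such face attributes could be read from Rekognition's DetectFaces API is given below; the file name is a placeholder.

```python
# Minimal sketch: read age range, gender, and emotion confidences for each
# detected face, as in the AWS Rekognition blocks above. DetectFaces with
# Attributes=["ALL"] returns one FaceDetails entry per detected face.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("feininger_ship_deck.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed to include AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```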

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Coat 96.5%

Categories

Imagga

events parties 95.6%
people portraits 4.2%