Human Generated Data

Title

Untitled (crowd standing around small boats, Mantoloking, NJ)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8506

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09 (scores are confidence percentages)

Person 99.9
Human 99.9
Person 99.6
Person 99.5
Person 97.4
Person 96
Person 88
Person 86.2
Boat 85
Transportation 85
Vehicle 85
Person 74.7
Cafeteria 67.3
Restaurant 67.3
Shorts 60.2
Clothing 60.2
Apparel 60.2
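
Labels in this format are what Amazon Rekognition's DetectLabels operation returns. As a minimal sketch of how such tags are generated, assuming boto3, a placeholder image path, and an assumed confidence cutoff:

    import boto3

    # Rekognition client; the region is an assumption
    client = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local path to the photograph
    with open("steinmetz_8506.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with 0-100 confidence scores
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=60,  # assumed cutoff; the list above bottoms out near 60
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")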

Clarifai
created on 2023-10-25 (scores are confidence percentages)

people 99.4
man 97.1
woman 95.9
group together 94.2
adult 93.2
music 93.1
sitting 90.9
monochrome 90.5
group 89.9
many 89.6
indoors 88.7
instrument 85.8
musician 85.4
street 83.7
child 81.9
audience 81.5
crowd 81.5
violin 79.8
military 79.7
vehicle 79.2
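
Clarifai concepts like these come from its prediction API. A sketch against what is assumed to be the v2 REST endpoint and public general model, with placeholder credentials and image URL:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
    MODEL_ID = "general-image-recognition"  # assumed public model ID

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )

    # Concepts carry a name and a 0-1 value, shown above as percentages
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")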

Imagga
created on 2022-01-09 (scores are confidence percentages)

man 26.9
adult 25
people 24.5
person 22
violin 19.8
bowed stringed instrument 19.2
smiling 18.8
happy 18.8
lifestyle 18.8
interior 17.7
musical instrument 17.2
male 16.4
sitting 16.3
cheerful 16.2
stringed instrument 15.8
stage 14.3
table 14.1
indoor 13.7
portrait 12.9
smile 12.8
chair 12.7
happiness 12.5
indoors 12.3
business 12.1
blond 12.1
home 12
women 11.9
two 11.8
work 11.8
men 11.2
worker 10.8
platform 10.8
looking 10.4
brass 9.9
attractive 9.8
working 9.7
businessman 9.7
one 9.7
group 9.7
couple 9.6
shop 9.5
casual 9.3
leisure 9.1
health 9
black 9
equipment 9
boy 8.7
senior 8.4
human 8.2
confident 8.2
transportation 8.1
sexy 8
family 8
face 7.8
pretty 7.7
vehicle 7.7
sit 7.6
communication 7.6
professional 7.5
joy 7.5
enjoyment 7.5
life 7.4
wind instrument 7.4
vacation 7.4
classroom 7.3
transport 7.3
trombone 7.3
laptop 7.3
student 7.2
music 7.2
body 7.2
passenger 7.1
machine 7.1
day 7.1
together 7
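
Imagga's tags are served by a REST endpoint authenticated with HTTP basic auth. A sketch, assuming the v2 /tags endpoint, with placeholder credentials and image URL:

    import requests

    AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=AUTH,
    )

    # Each tag has an English label and a 0-100 confidence
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")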

Google
created on 2022-01-09 (no tags recorded)

Microsoft
created on 2022-01-09 (scores are confidence percentages)

ship 95.3
person 94.1
black and white 92.4
text 91.1
indoor 88.3
clothing 85.5
man 72.9
monochrome 53
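
Microsoft's tags correspond to the Azure Computer Vision analyze operation. A sketch against the v3.2 REST API, with a placeholder resource endpoint and key:

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"  # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/photo.jpg"},
    )

    # Azure reports confidence on a 0-1 scale, shown above as percentages
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")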

Face analysis

One result block follows per face detected by each service.

AWS Rekognition

Age 35-43
Gender Female, 71.3%
Calm 86.3%
Happy 11.2%
Sad 1%
Confused 0.4%
Fear 0.4%
Disgusted 0.3%
Angry 0.2%
Surprised 0.2%

AWS Rekognition

Age 28-38
Gender Male, 92.8%
Happy 75.7%
Calm 21.7%
Sad 1.8%
Confused 0.2%
Angry 0.2%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 34-42
Gender Male, 73.5%
Sad 96.5%
Calm 1.2%
Fear 0.6%
Happy 0.5%
Surprised 0.5%
Angry 0.3%
Disgusted 0.2%
Confused 0.2%
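
Age ranges, gender estimates, and ranked emotions like those above are what Rekognition's DetectFaces returns when all facial attributes are requested. A boto3 sketch with a hypothetical image path:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

    with open("steinmetz_8506.jpg", "rb") as f:  # hypothetical path
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotions per face
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Sort emotions high to low, matching the listings above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")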

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
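
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A sketch with the google-cloud-vision client and a hypothetical image path:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_8506.jpg", "rb") as f:  # hypothetical path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)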

Feature analysis

Amazon

Person 99.9%
Boat 85%

Text analysis

Amazon

17330
SA
38
A
MURP
17330.
SA co
MURP ЧЕН
co
EVEETA
NAGOY
ЧЕН
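
Fragments like these, including the Cyrillic misreads, are raw output of Rekognition's DetectText, which transcribes whatever lettering it finds (here, apparently boat registration numbers). A boto3 sketch with a hypothetical image path:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

    with open("steinmetz_8506.jpg", "rb") as f:  # hypothetical path
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # LINE detections are whole strings; WORD detections are their pieces
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])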

Google

17330· HUHP YE SA CO 17330.
17330·
HUHP
YE
SA
CO
17330.
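
Google's reading of the same lettering comes from the Vision API's text_detection feature; the first annotation is the full text block and the rest are individual tokens, which matches the shape of the list above. A sketch with the same hypothetical image path:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_8506.jpg", "rb") as f:  # hypothetical path
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # text_annotations[0] is the full detected text; the rest are tokens
    for annotation in response.text_annotations:
        print(annotation.description)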