Human Generated Data

Title

Untitled (two women singing on stage in front of a band)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12166

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.5
Person 99.5
Clothing 98.2
Apparel 98.2
Shorts 98.2
Person 97.7
Person 95.1
Person 90.5
Person 84.4
Person 84.3
Person 81.4
Sports 79.8
Sport 79.8
Person 79.4
Person 77.6
People 77.2
Person 75.4
Working Out 70
Exercise 70
Crowd 68.2
Fitness 66.6
Shoe 64.1
Footwear 64.1
Female 62.8
Indoors 60.3
Running 58.5
Path 58.2
Room 58
Person 44.6

Clarifai
created on 2023-10-26

people 99.9
many 97.4
group together 97
man 97
adult 96.9
group 96.8
child 95.9
music 94.6
woman 92.9
monochrome 92.4
musician 90.6
spectator 89.6
crowd 88.3
recreation 85.3
dancing 83
street 82.8
boy 82.5
education 79.4
audience 79
enjoyment 78.3

Imagga
created on 2022-01-22

sport 22.4
people 21.2
male 19.8
group 17.7
man 17.5
men 17.2
sky 15.9
city 15.8
musical instrument 15.7
active 15.1
water 14.7
person 14.4
outdoors 14.2
cello 14.1
building 13.9
athlete 13.8
silhouette 13.2
vacation 13.1
lifestyle 13
bowed stringed instrument 12.9
competition 12.8
exercise 12.7
travel 12.7
activity 12.5
architecture 12.5
adult 12.3
beach 12
dancer 11.8
brass 11.5
stringed instrument 11.5
urban 11.4
speed 11
sea 10.9
summer 10.9
ocean 10.9
fitness 10.8
recreation 10.7
outdoor 10.7
wind instrument 10.6
hall 10.4
leisure 10
team 9.8
river 9.8
crowd 9.6
performer 9.6
walking 9.5
day 9.4
teenager 9.1
fun 9
women 8.7
wheeled vehicle 8.6
motion 8.6
device 8.5
modern 8.4
action 8.3
room 8.3
tourism 8.2
happy 8.1
gymnasium 8.1
professional 8.1
transportation 8.1
business 7.9
scene 7.8
run 7.7
skateboard 7.7
tree 7.7
winter 7.7
clouds 7.6
house 7.5
shore 7.4
ball 7.4
teacher 7.4
chair 7.2
sunset 7.2
portrait 7.1
sand 7.1
work 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.7
person 84.8
clothing 75.5
footwear 69.6

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 76.2%
Calm 89.8%
Fear 3.3%
Sad 3.3%
Confused 1.7%
Happy 1%
Surprised 0.4%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 22-30
Gender Male, 90.3%
Calm 91.7%
Sad 5.8%
Happy 1.1%
Angry 0.4%
Fear 0.3%
Disgusted 0.3%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 26-36
Gender Female, 96%
Calm 81.6%
Sad 9.2%
Happy 2.9%
Confused 2.4%
Angry 1.4%
Fear 1.3%
Disgusted 0.8%
Surprised 0.4%

AWS Rekognition

Age 29-39
Gender Male, 97.3%
Calm 90.7%
Sad 7.8%
Confused 0.5%
Fear 0.4%
Happy 0.3%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.5%
Shoe 64.1%

Categories

Imagga

paintings art 99%

Text analysis

Amazon

13862
13862.

Google

13862. 13862. ८१६६। HAGON-YT3RA2--MAM1
13862.
८१६६।
HAGON-YT3RA2--MAM1