Human Generated Data

Title

Untitled (man playing croquet surrounded by seated spectators)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5557

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99
Person 98.9
Person 98.6
Sport 97.7
Sports 97.7
Person 97.2
Person 97
Person 96.3
Person 94.7
Person 92.4
Croquet 84.4
Person 74.1
Person 67.6
Cricket 56.6

Clarifai
created on 2023-10-26

people 98.8
sport 94.5
man 93.8
monochrome 92
adult 89.1
exercise 88
competition 87.6
recreation 85.9
athlete 84.7
active 82.4
child 77.5
street 77.2
fun 76.2
group together 75.1
karate 74.2
education 73.9
position 72
young 71.4
sports equipment 71.3
action energy 70

Imagga
created on 2022-01-23

negative 100
film 100
photographic paper 80.6
photographic equipment 53.7
silhouette 29
people 28.4
businessman 28.2
man 25.5
crowd 24.9
business 24.9
work 22.8
person 22.3
male 22
team 21.5
audience 20.5
cheering 19.6
stadium 19.5
flag 19.4
patriotic 19.2
nation 18.9
symbol 18.8
nighttime 18.6
design 18.6
lights 18.5
icon 18.2
job 17.7
teamwork 17.6
vibrant 17.5
bright 17.1
boss 16.3
presentation 15.8
businesswoman 15.4
leader 15.4
vivid 14.9
occupation 14.7
sexy 14.4
ice 14.3
modern 14
speech 13.7
supporters 12.8
president 12.8
technology 11.1
sport 10.7
digital 10.5
group 10.5
chart 9.6
men 9.4
city 9.1
suit 9
urban 8.7
motion 8.6
athlete 8.5
adult 8.4
success 8
financial 8
working 8
lifestyle 7.9
corporate 7.7
skill 7.7
graph 7.7
player 7.5
human 7.5
manager 7.4
document 7.4
event 7.4
graphic 7.3
global 7.3
office 7.2
conceptual 7
architecture 7
sky 7

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 78%
Calm 98.3%
Sad 0.6%
Happy 0.4%
Surprised 0.2%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0%

AWS Rekognition

Age 30-40
Gender Female, 70.3%
Confused 51.7%
Happy 17.2%
Calm 16.3%
Sad 5.9%
Surprised 3.8%
Angry 2.2%
Disgusted 1.5%
Fear 1.4%

AWS Rekognition

Age 31-41
Gender Male, 65.1%
Happy 35.9%
Calm 28.9%
Fear 24.2%
Sad 4.1%
Angry 3%
Disgusted 1.5%
Surprised 1.3%
Confused 1.1%

AWS Rekognition

Age 18-26
Gender Male, 63.3%
Calm 27.7%
Sad 19%
Happy 13.3%
Fear 11.1%
Surprised 10.6%
Angry 8.1%
Confused 7.4%
Disgusted 2.8%

AWS Rekognition

Age 20-28
Gender Female, 51.3%
Calm 67.5%
Sad 11.2%
Confused 6.1%
Fear 4.9%
Surprised 3.3%
Angry 2.9%
Disgusted 2.1%
Happy 1.9%

AWS Rekognition

Age 21-29
Gender Female, 85.1%
Calm 59.8%
Surprised 18.3%
Fear 7.5%
Sad 3.9%
Angry 3.8%
Disgusted 2.8%
Happy 2.1%
Confused 1.9%

AWS Rekognition

Age 33-41
Gender Female, 77.4%
Happy 67.3%
Calm 26.4%
Angry 1.9%
Sad 1.5%
Surprised 0.8%
Fear 0.8%
Confused 0.8%
Disgusted 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft
created on 2022-01-23

a man holding a baseball bat 35.6%
a man with a baseball bat 35.5%

Text analysis

Amazon

23002
28'
20092

Google

23002 23002
23002