Human Generated Data

Title

Untitled (Joe Steinmetz with lasso and cameras)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8659

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.1
Human 99.1
Person 98.7
Home Decor 98
Person 97.7
Clothing 87.3
Apparel 87.3
Female 77.1
Toy 75.5
Face 70.6
Shorts 62.3
Portrait 61.7
Photography 61.7
Photo 61.7
Hula 60.5
Sleeve 59.5
Woman 57.9
Play 56.2
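
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition label-detection API. The sketch below shows, under stated assumptions, how such pairs can be retrieved with boto3; the file name "steinmetz_8659.jpg" and the MinConfidence threshold are illustrative placeholders, not values stored in this record.

# Sketch (assumed setup): label detection with AWS Rekognition via boto3.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_8659.jpg", "rb") as f:   # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,                         # the tags above bottom out near 56
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')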

Clarifai
created on 2023-10-26

people 99.7
monochrome 99
man 96.8
group 92.4
adult 92.3
two 88.7
technology 88.7
illustration 88
three 87.7
room 86
group together 85.8
interaction 82.4
connection 81.4
indoors 80.3
education 77.3
leader 76.1
science 75.5
wear 73.3
child 72.6
war 72.5

Imagga
created on 2022-01-09

sketch 36.1
drawing 27.9
man 22.8
equipment 22.8
backboard 19.9
representation 19
people 18.4
male 17
person 16.2
work 16.1
business 15.2
lifestyle 14.4
technology 14.1
construction 13.7
architecture 13.3
interior 13.3
urban 13.1
professional 12.4
device 12.4
floor 12.1
office 12
window 11.2
modern 11.2
house 10.9
city 10.8
net 10.7
job 10.6
wicket 10.5
building 10.4
play 10.3
black 10.2
design 10.1
communication 10.1
3d 10.1
digital 9.7
player 9.6
active 9.4
men 9.4
basket 9.4
adult 9.4
hall 9.3
ball 8.8
information 8.8
sport 8.7
wall 8.5
plan 8.5
croquet equipment 8.4
hand 8.3
game 8
worker 8
instrument 7.9
sports equipment 7.9
tool 7.7
project 7.7
crowd 7.7
engineering 7.6
athlete 7.6
silhouette 7.4
company 7.4
holding 7.4
wire 7.3
speed 7.3
competition 7.3
connection 7.3
new 7.3
basketball 7.3
exercise 7.3
futuristic 7.2
transportation 7.2
hair 7.1
women 7.1
working 7.1
businessman 7.1
growth 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.5
person 85.5
clothing 73.1
black and white 64
posing 52.4
old 45.3

Color Analysis

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.9%
Happy 95.6%
Sad 1.2%
Surprised 0.9%
Disgusted 0.7%
Angry 0.5%
Calm 0.4%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 50-58
Gender Male, 88.8%
Confused 31.7%
Calm 31.5%
Disgusted 12.5%
Sad 8.4%
Angry 5.9%
Fear 5.9%
Surprised 2.9%
Happy 1.3%

AWS Rekognition

Age 21-29
Gender Male, 75.3%
Calm 70.4%
Confused 12.6%
Sad 9.8%
Angry 1.7%
Fear 1.7%
Disgusted 1.6%
Surprised 1.6%
Happy 0.6%

AWS Rekognition

Age 54-64
Gender Male, 95.9%
Calm 49.1%
Sad 33.2%
Confused 13.6%
Angry 1.4%
Surprised 0.8%
Happy 0.8%
Disgusted 0.7%
Fear 0.4%
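
The four AWS Rekognition blocks above (age range, gender, and emotion percentages) match the face-detail fields returned by Rekognition's face-detection API. A minimal sketch, reusing the same hypothetical file name:

# Sketch (assumed setup): face details with AWS Rekognition detect_faces.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_8659.jpg", "rb") as f:   # placeholder file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],                   # request age range, gender, emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')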

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
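
The Google Vision blocks report likelihood ratings (Very unlikely through Very likely) rather than percentages. A comparable sketch using the google-cloud-vision client, again with a placeholder file name:

# Sketch (assumed setup): face attribute likelihoods with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_8659.jpg", "rb") as f:   # placeholder file name
    image = vision.Image(content=f.read())

likelihood = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])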

Feature analysis

Amazon

Person 99.1%

Captions

Text analysis

Amazon

22788

Google

22788
22788
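
Both text-analysis services read the same string, "22788". A minimal sketch of line-level text detection with AWS Rekognition, with the same placeholder file name:

# Sketch (assumed setup): text detection with AWS Rekognition detect_text.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_8659.jpg", "rb") as f:   # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])      # e.g. "22788" in this record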