Human Generated Data

Title

Untitled (three men posed with racing dog)

Date

1972

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11568

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Apparel 99.8
Clothing 99.8
Person 99.6
Human 99.6
Person 99.3
Person 99
Chair 96
Furniture 96
Lighting 95.8
Suit 90.9
Coat 90.9
Overcoat 90.9
Sleeve 73.1
Shorts 71
Long Sleeve 70.4
Clinic 70.2
Female 68.5
Lab Coat 65.8
Table 65.1
Photography 64.9
Photo 64.9
Face 64.4
Portrait 64.4
Crowd 61.7
Man 60.4
Pottery 59.7
Vase 59.7
Potted Plant 59.7
Jar 59.7
Plant 59.7
Outdoors 59.3
Sailor Suit 58.3
Blazer 57.5
Jacket 57.5
Stage 57.2
Shirt 56.3
Tuxedo 55.5
Girl 55.3
Home Decor 55.1
Linen 55.1
Waterfront 55.1
Port 55.1
Pier 55.1
Water 55.1
Dock 55.1

Clarifai
created on 2023-10-25

people 99.8
adult 98.5
man 97.7
monochrome 97.1
woman 89.2
two 84.9
wear 84.3
indoors 82.2
uniform 80.3
side view 79.4
education 76
chair 76
child 75.8
group together 74.7
group 72.2
vehicle 72.1
one 70.4
outfit 70.3
facial expression 67.9
veil 66.6

Imagga
created on 2022-01-15

people 30.1
person 28
man 25.5
adult 21.5
male 20.6
hospital 20.2
health 19.4
men 17.2
interior 16.8
smiling 16.6
lifestyle 16.6
patient 15.5
medical 15
indoors 14.9
nurse 14.5
fitness 14.4
chair 14.3
women 14.2
work 14.1
professional 13.9
portrait 13.6
human 13.5
modern 13.3
brass 12.9
active 12.7
day 12.5
standing 12.2
sport 11.8
exercise 11.8
gym 11.5
office 11.4
happy 11.3
indoor 10.9
business 10.9
holding 10.7
wind instrument 10.3
strength 10.3
training 10.2
case 10.1
uniform 10
worker 9.9
device 9.7
group 9.7
exercising 9.6
exercise bike 9.6
equipment 9.6
room 9.1
together 8.8
happiness 8.6
corporate 8.6
motion 8.6
smile 8.5
two 8.5
life 8.4
weight 8.3
ball 8.3
care 8.2
sword 8
working 7.9
businessman 7.9
sick person 7.9
medicine 7.9
couple 7.8
effort 7.8
hands 7.8
healthy lifestyle 7.8
assistant 7.8
sitting 7.7
pretty 7.7
walk 7.6
club 7.5
senior 7.5
leisure 7.5
camera 7.4
sports equipment 7.3
color 7.2
musical instrument 7.2
exercise device 7.2
home 7.2
team 7.2

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 95.4
clothing 93.7
person 89.4
man 67.5
black and white 65.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Male, 100%
Sad 74.3%
Confused 20.3%
Calm 2.3%
Surprised 1%
Disgusted 0.7%
Happy 0.7%
Angry 0.5%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 97.1%
Confused 65.1%
Angry 21.6%
Sad 4.3%
Calm 2.6%
Disgusted 2.6%
Surprised 2%
Happy 1.6%
Fear 0.3%

AWS Rekognition

Age 48-54
Gender Male, 98.8%
Calm 22%
Happy 21.7%
Confused 21.5%
Sad 11.6%
Fear 10%
Disgusted 7.4%
Angry 3.3%
Surprised 2.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

interior objects 99.3%

Text analysis

Amazon

SARASOTA
SPEED
CLASSIC
59721

Google

59121 SARASOTA SPEED GLASSIC
SARASOTA
GLASSIC
59121
SPEED