Human Generated Data

Title

Untitled (three students riding on teacup ride at carnival)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10631

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Furniture 100
Apparel 99.8
Clothing 99.8
Human 96.9
Person 96.9
Person 94.3
Bridegroom 92.1
Wedding 92.1
Robe 91.4
Fashion 91.4
Female 91.1
Gown 90.3
Dress 90.2
Table 87.3
Suit 86.6
Overcoat 86.6
Coat 86.6
Face 84.2
Wedding Gown 80.9
Bride 78.3
Dining Table 77.6
Woman 77.5
Person 71.4
Chair 70.2
Portrait 68.3
Photography 68.3
Photo 68.3
Man 62.3
Girl 61.8
Shorts 60.9
Shirt 57.2
Transportation 56.6
Vehicle 55.4
Chair 51.4
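
The tag lists above are flattened "label score" pairs, where the score is the service's detection confidence in percent (repeated labels such as Person and Chair are separate detected instances). A minimal parsing sketch, assuming each line is a label followed by a single numeric score as its last token (the helper name is illustrative, not part of any service's API):

```python
def parse_tags(lines):
    """Split each 'Label score' line on its last space into (label, score)."""
    tags = []
    for line in lines:
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

# A few lines from the Amazon list above.
sample = ["Furniture 100", "Wedding Gown 80.9", "Chair 51.4"]
print(parse_tags(sample))
# [('Furniture', 100.0), ('Wedding Gown', 80.9), ('Chair', 51.4)]
```

The same split works for the Clarifai, Imagga, and Microsoft lists, since multi-word labels only ever have the score after the final space.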

Clarifai
created on 2023-10-25

people 99.8
chair 99.2
monochrome 98.7
sitting 96.8
sit 96.7
group together 96.1
adult 95.8
man 95.6
seat 95.3
woman 93.4
two 92.3
furniture 91.6
mirror 90.8
group 89.4
vehicle 89.2
transportation system 88.1
child 87.2
indoors 85.6
leader 85.3
three 84.8

Imagga
created on 2022-01-09

helmet 25.5
football helmet 24.3
world 24.3
globe 23.1
technology 21.5
earth 21
equipment 19.2
device 18.2
3d 17.8
global 17.3
digital 17
business 16.4
planet 16.2
headdress 15.2
communication 11.7
film 10.7
negative 10.3
clothing 10.3
data 10
graphics 10
metal 9.6
future 9.3
finance 9.3
network 9.3
travel 9.1
modern 9.1
futuristic 9
space 8.5
money 8.5
three dimensional 8.4
power 8.4
machine 8.3
protection 8.2
computer 8
science 8
information 8
art 7.9
work 7.8
transfer 7.8
render 7.8
glass 7.8
continent 7.8
sky 7.6
web 7.6
building 7.5
sign 7.5
covering 7.5
map 7.5
symbol 7.4
backboard 7.4
connection 7.3
success 7.2
seat 7.2
steel 7.1

Microsoft
created on 2022-01-09

text 98.3
person 70.8
clothing 66.5
black and white 56.4
image 33.8

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 55.1%
Happy 86.1%
Calm 4.4%
Fear 3.3%
Surprised 1.9%
Sad 1.6%
Angry 1.1%
Disgusted 0.9%
Confused 0.6%

AWS Rekognition

Age 38-46
Gender Male, 87.6%
Calm 98%
Confused 0.6%
Happy 0.4%
Surprised 0.3%
Disgusted 0.2%
Sad 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Male, 60.1%
Happy 95%
Sad 2.6%
Calm 1.1%
Confused 0.5%
Surprised 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%
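
Each Rekognition block above reports one detected face as an emotion distribution summing to roughly 100%. A small sketch of reducing such a distribution to its dominant emotion, using the first face's scores from the record (the helper is a hypothetical illustration, not a Rekognition API call):

```python
def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Scores for the first face (Age 37-45) above.
face1 = {"Happy": 86.1, "Calm": 4.4, "Fear": 3.3, "Surprised": 1.9,
         "Sad": 1.6, "Angry": 1.1, "Disgusted": 0.9, "Confused": 0.6}
print(dominant_emotion(face1))  # ('Happy', 86.1)
```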

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
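
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets rather than numeric scores. A sketch mapping the labels above onto ordinal values for comparison (the bucket names follow Vision's Likelihood scale; the numeric mapping itself is an illustrative assumption):

```python
# Ordinal encoding of Google Vision likelihood buckets (assumed scale).
LIKELIHOOD = {"Very unlikely": 1, "Unlikely": 2, "Possible": 3,
              "Likely": 4, "Very likely": 5}

def to_ordinal(attributes):
    """Map each attribute's likelihood label to its ordinal value."""
    return {name: LIKELIHOOD[label] for name, label in attributes.items()}

# One of the identical face results above.
face = {"Surprise": "Very unlikely", "Joy": "Very unlikely"}
print(to_ordinal(face))  # {'Surprise': 1, 'Joy': 1}
```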

Feature analysis

Amazon

Person 96.9%
Chair 70.2%

Text analysis

Amazon

34636
ARBY
COTTON
NAV
CANDY
ARBY TO
er
YT33A3
vagoy
FREE
TO

Google

のE9カE
E9
E