Human Generated Data

Title

Untitled (three students riding on teacup ride at carnival ride)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10635

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 97.7
Human 97.7
Clothing 96.7
Apparel 96.7
Chair 96.1
Furniture 96.1
Person 94.7
Person 88.4
Person 80
Vehicle 70.6
Transportation 70.6
Suit 70.3
Overcoat 70.3
Coat 70.3
Robe 70.1
Fashion 70.1
Airplane 70.1
Aircraft 70.1
Road 69.3
Outdoors 68
Face 67.7
Portrait 67.1
Photography 67.1
Photo 67.1
Bridegroom 66.4
Wedding 66.4
Building 64.2
Urban 61.5
Female 61.1
Nature 60.3
City 59.7
Town 59.7
Gown 59.3
Wedding Gown 55.7

Clarifai
created on 2023-10-25

people 99.7
monochrome 97.8
man 96.9
adult 95.7
woman 95.1
chair 93.4
group together 92.5
sitting 92.5
vehicle 91.8
transportation system 91.7
two 89.7
group 89.6
child 88.9
music 87
carousel 86.7
recreation 86.5
mirror 86.4
retro 82.2
three 82.1
wear 81.2

Imagga
created on 2022-01-09

negative 54.4
film 43.6
photographic paper 33
money 24.6
currency 24.2
business 23.1
dollar 22.3
photographic equipment 22.1
finance 21.9
technology 19.3
cash 18.3
financial 17.8
bank 17
banking 16.5
wealth 15.2
design 14.1
equipment 13.1
drawing 13.1
architecture 12.6
symbol 12.1
construction 12
sketch 11.9
modern 11.9
device 10.9
world 10.7
building 10.6
dollars 10.6
bill 10.5
close 10.3
economy 10.2
digital 9.7
bills 9.7
glass 9.6
work 9.4
rich 9.3
global 9.1
one 9
idea 8.9
computer 8.8
banknote 8.7
paper 8.6
exchange 8.6
old 8.4
hundred 7.7
capital 7.7
map 7.6
savings 7.4
vintage 7.4
future 7.4
closeup 7.4
structure 7.4
investment 7.3
machine 7.3
graphic 7.3
conceptual 7
travel 7

Microsoft
created on 2022-01-09

text 99.4

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 86.1%
Happy 87.1%
Fear 8.5%
Surprised 2.3%
Angry 0.6%
Sad 0.5%
Calm 0.4%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 29-39
Gender Female, 88.6%
Calm 99.6%
Confused 0.2%
Sad 0.1%
Disgusted 0%
Happy 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Male, 69.6%
Calm 53%
Sad 32.5%
Happy 13.3%
Confused 0.8%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.7%
Airplane 70.1%

Categories

Imagga

paintings art 99.1%

Text analysis

Amazon

ARE
COTTO
ARBY
ANDY
yumo ARBY
сот
enjoy!
ap
MIDDLE
e
e CINE DETRE MIDDLE
CINE
DETRE
yumo

Google

LEのカ
LE