Human Generated Data

Title

Untitled (man riding bull at rodeo)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4983

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 97.3
Person 95.9
Bull 95.4
Animal 95.4
Mammal 95.4
Dog 84.4
Canine 84.4
Pet 84.4
Bullfighter 84
Person 82
Person 70.2
Horse 69.3
Rodeo 66.1
Bullfighting 65.4
Person 62.2
Person 62
Person 60
Andalusian Horse 58.3
Person 52.9

Clarifai
created on 2023-10-26

people 99.9
group together 99.4
many 98.6
man 96.9
adult 96.1
monochrome 93.1
competition 92.6
baseball 92.4
athlete 92.3
crowd 91.6
group 90.8
street 90.7
action 90.5
wear 90
one 88
spectator 85.6
motion 85.1
two 84.7
recreation 83.2
transportation system 82.8

Imagga
created on 2022-01-22

shopping cart 100
handcart 100
wheeled vehicle 100
container 77.8
conveyance 39.9
cart 20.5
architecture 20.3
shopping 18.3
building 17.5
city 16.6
sky 16.6
metal 16.1
urban 15.7
empty 15.5
house 14.2
buy 14.1
construction 13.7
business 12.8
shop 12.2
sea 11.7
landscape 11.2
sale 11.1
ocean 10.8
market 10.7
travel 10.6
sand 10.5
push 10.4
retail 10.4
store 10.4
day 10.2
beach 10.1
tree 10
park 9.9
supermarket 9.8
summer 9.6
structure 9.6
trade 9.6
wheel 9.4
winter 9.4
water 9.3
metallic 9.2
transport 9.1
design 9
tower 8.9
trolley 8.9
mall 8.8
scene 8.7
wall 8.5
money 8.5
commercial 8.5
outdoor 8.4
street 8.3
new 8.1
sun 8
river 8
trees 8
drawing 7.8
life 7.8
buying 7.7
basket 7.7
industry 7.7
cityscape 7.6
plan 7.6
tourism 7.4
symbol 7.4
man 7.4
office 7.2

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.6
black and white 95.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 98.3%
Calm 88.6%
Happy 5.3%
Sad 2.3%
Disgusted 1.2%
Fear 1%
Angry 0.7%
Surprised 0.7%
Confused 0.3%

AWS Rekognition

Age 34-42
Gender Male, 56.7%
Sad 31.7%
Calm 29.6%
Happy 23.1%
Confused 6.3%
Disgusted 5.7%
Angry 1.6%
Surprised 1.1%
Fear 0.9%

AWS Rekognition

Age 27-37
Gender Male, 90.6%
Calm 48.5%
Sad 33.4%
Confused 6.8%
Angry 3.7%
Happy 2.8%
Surprised 2.5%
Disgusted 1.7%
Fear 0.7%

AWS Rekognition

Age 23-31
Gender Female, 81.2%
Calm 74.7%
Sad 9.7%
Happy 4.8%
Confused 3.2%
Angry 2.6%
Disgusted 2.4%
Surprised 1.9%
Fear 0.9%

Feature analysis

Amazon

Person 95.9%
Dog 84.4%

Categories

Captions

Microsoft
created on 2022-01-22

a close up of a fence 58.1%

Text analysis

Amazon

15515
-
KODVN

Google

15515 A) AL 1 il!
15515
A)
AL
1
il!