Human Generated Data

Title

Untitled (dog riding on horse around circus ring)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7627

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 98.6
Person 98.6
Person 97.3
Clothing 93.9
Apparel 93.9
Person 92.1
People 86.8
Horse 84.4
Mammal 84.4
Animal 84.4
Person 78.5
Person 78.1
Furniture 77.9
Chair 77.9
Person 75.5
Building 74.2
Person 74.2
Field 73.9
Person 72.7
Transportation 70.6
Person 70.1
Shorts 69.7
Canopy 69.2
Vehicle 68.9
Person 68.8
Sport 60.2
Sports 60.2
Bicycle 59.1
Bike 59.1
Helmet 58.1
Crowd 57.4
Person 57.1
Tarmac 55.5
Asphalt 55.5
Person 55.3
Team 55
Team Sport 55
Person 43.4
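
The labels above have the shape of output from the AWS Rekognition DetectLabels API (label name plus a 0-100 confidence score). A minimal sketch of how comparable tags could be generated with boto3 follows; the image path is a hypothetical local copy of the photograph, and this is not a description of the museum's actual pipeline.

    import boto3

    # Hypothetical local copy of the photograph; not a path from the catalog record.
    IMAGE_PATH = "steinmetz_circus_ring.jpg"

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns object/scene labels with confidence scores (0-100),
    # comparable to the "Human 98.6", "Horse 84.4" style entries listed above.
    response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=40)

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")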

Clarifai
created on 2023-10-25

people 99.8
cavalry 99.7
group together 99.4
monochrome 98.2
many 98
competition 97.5
mammal 95.3
adult 94.6
racehorse 94.5
group 94.2
man 93.9
seated 93.2
motion 92.4
recreation 90.8
athlete 90
two 89.1
one 88.7
action 88.4
three 87.6
street 86.5

Imagga
created on 2022-01-08

sky 25.6
billboard 23.4
clouds 21.1
beach 21
structure 20.9
graffito 19.6
signboard 19
landscape 18.6
travel 17.6
sea 16.6
ocean 16.1
silhouette 15.7
water 15.4
sun 13.7
decoration 13.6
winter 13.6
outdoors 13.4
field 13.4
coast 12.6
cloud 12.1
sunset 11.7
scenic 11.4
sand 11.4
summer 10.9
tree 10.8
snow 10.6
outdoor 9.9
tourism 9.9
vacation 9.8
rural 9.7
fog 9.7
people 9.5
day 9.4
man 9.4
shore 9.4
shopping cart 9.3
building 9.2
wheeled vehicle 9.1
old 9.1
environment 9.1
horizon 9
destruction 8.8
disaster 8.8
architecture 8.6
park 8.6
bay 8.5
cloudy 8.4
black 8.4
evening 8.4
waves 8.4
danger 8.2
scenery 8.1
transportation 8.1
river 8
holiday 7.9
male 7.8
season 7.8
scene 7.8
cold 7.8
handcart 7.7
fishing 7.7
sunrise 7.5
smoke 7.4
equipment 7.4
light 7.4
island 7.3
protection 7.3
grass 7.1
night 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.8
outdoor 92.9
horse 81
black and white 77.5
white 70.3
black 66.9
person 66.1
sky 61.5
vintage 51.2

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 98.1%
Happy 46.4%
Calm 25.7%
Disgusted 20.8%
Sad 2.7%
Surprised 1.6%
Fear 1.2%
Confused 0.8%
Angry 0.8%

AWS Rekognition

Age 22-30
Gender Male, 97%
Calm 81.4%
Sad 9%
Angry 2.8%
Surprised 1.6%
Happy 1.5%
Disgusted 1.5%
Confused 1.3%
Fear 1%

AWS Rekognition

Age 23-33
Gender Male, 61.1%
Calm 75.8%
Fear 10.3%
Sad 10.1%
Happy 1.2%
Confused 1%
Surprised 0.7%
Angry 0.4%
Disgusted 0.4%

AWS Rekognition

Age 19-27
Gender Female, 65.9%
Happy 31.2%
Calm 22.6%
Disgusted 18.3%
Fear 16.5%
Sad 4.4%
Confused 2.5%
Surprised 2.3%
Angry 2.2%

AWS Rekognition

Age 13-21
Gender Male, 92%
Sad 56.8%
Calm 21%
Happy 18.5%
Disgusted 1.1%
Angry 0.9%
Confused 0.7%
Fear 0.6%
Surprised 0.4%

AWS Rekognition

Age 10-18
Gender Male, 77.6%
Calm 97.4%
Happy 0.8%
Angry 0.5%
Sad 0.3%
Fear 0.3%
Disgusted 0.3%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 6-14
Gender Female, 84.7%
Calm 61.8%
Fear 23.9%
Sad 9.8%
Disgusted 1.6%
Happy 1.2%
Angry 0.8%
Confused 0.6%
Surprised 0.3%

AWS Rekognition

Age 22-30
Gender Male, 90.3%
Calm 71.2%
Disgusted 15.8%
Sad 5.1%
Happy 3.8%
Fear 1.7%
Angry 1.3%
Surprised 0.6%
Confused 0.4%

AWS Rekognition

Age 25-35
Gender Male, 87.7%
Calm 81.9%
Happy 8.1%
Angry 2.5%
Fear 2.3%
Confused 1.6%
Disgusted 1.5%
Sad 1.5%
Surprised 0.6%

AWS Rekognition

Age 21-29
Gender Male, 94.7%
Happy 32.1%
Disgusted 24.9%
Calm 23.6%
Sad 7.9%
Confused 5%
Fear 2.4%
Surprised 2.1%
Angry 2.1%

AWS Rekognition

Age 14-22
Gender Male, 96.1%
Calm 56.7%
Sad 15.3%
Happy 8.3%
Fear 7.4%
Disgusted 4.5%
Angry 4.2%
Surprised 2.3%
Confused 1.2%

AWS Rekognition

Age 16-22
Gender Male, 91.8%
Calm 65.9%
Sad 20.7%
Disgusted 3.9%
Fear 2.7%
Confused 2.4%
Happy 2.2%
Angry 1.5%
Surprised 0.7%

AWS Rekognition

Age 23-33
Gender Female, 61%
Calm 83.4%
Sad 8.9%
Confused 2.8%
Surprised 1.3%
Happy 1.2%
Fear 0.9%
Disgusted 0.8%
Angry 0.7%

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Happy 35.6%
Sad 22.7%
Calm 17.8%
Angry 9%
Disgusted 5.2%
Fear 3.9%
Surprised 3%
Confused 2.8%

AWS Rekognition

Age 23-31
Gender Female, 55.5%
Happy 48.5%
Calm 16.6%
Sad 15%
Confused 7.1%
Disgusted 4.6%
Fear 3.3%
Angry 2.4%
Surprised 2.4%

AWS Rekognition

Age 23-33
Gender Male, 81.6%
Calm 93.3%
Sad 3%
Fear 1.8%
Angry 0.7%
Surprised 0.5%
Happy 0.4%
Confused 0.2%
Disgusted 0.2%
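
Each block above (age range, gender estimate, ranked emotions) matches the per-face output of the AWS Rekognition DetectFaces API. A minimal sketch with boto3 is shown below, again assuming a hypothetical local copy of the image; it illustrates the shape of the data rather than the museum's actual workflow.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_circus_ring.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion estimates
    # for every detected face, the same fields listed in the blocks above.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in emotions:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")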

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
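
Unlike Rekognition, Google Cloud Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages, which is why the Google Vision blocks above look different. A minimal sketch using the google-cloud-vision client, with a hypothetical local image path:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_circus_ring.jpg", "rb") as f:  # hypothetical local copy
        content = f.read()

    # Face detection returns likelihood buckets (VERY_UNLIKELY ... VERY_LIKELY)
    # for joy, sorrow, anger, surprise, headwear, and blur.
    response = client.face_detection(image=vision.Image(content=content))

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)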

Feature analysis

Amazon

Person 98.6%
Horse 84.4%

Categories

Captions

Microsoft
created on 2022-01-08

a vintage photo of a person 73.2%
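
The caption above ("a vintage photo of a person", with a confidence score) is the kind of result returned by the Azure Computer Vision describe operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK is below; the endpoint, key, and image path are placeholders, not values from the catalog record.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and subscription key.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )

    with open("steinmetz_circus_ring.jpg", "rb") as f:  # hypothetical local copy
        # Describe returns ranked caption candidates with 0-1 confidence scores.
        analysis = client.describe_image_in_stream(f, max_candidates=3)

    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")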

Text analysis

Amazon

د8
8.5.58E
YТ37А°-

Google

58 YT37A°2- AGO 38SS8
58
YT37A°2-
AGO
38SS8