Human Generated Data

Title

Untitled (man playing croquet surrounded by seated spectators)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5556

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.5
Human 98.5
Person 96.8
Person 96.2
Person 95.4
Person 91.1
Person 88.6
Person 87.7
Person 85.4
Person 83.7
Nature 82.3
Outdoors 79.9
Room 76.2
Indoors 76.2
Crowd 71.0
Person 69.7
Building 69.5
Person 69.2
People 68.2
Stage 66.4
Leisure Activities 61.2
Hall 56.6
Person 55.9
Sport 55.6
Sports 55.6
Architecture 55
Dance 55
Person 42.6

Clarifai
created on 2023-10-27

people 99.2
architecture 98.7
snow 98.5
street 98.5
monochrome 98.2
winter 97.5
city 95.8
building 95.4
child 95.4
dog 94.2
family 92.5
black and white 92.1
man 91.7
house 91.5
square 91.3
woman 91.1
adult 90.9
frost 88.8
town 88.7
many 86.5

Imagga
created on 2022-01-23

negative 44.1
film 36.7
photographic paper 26.9
business 26.1
architecture 22
city 21.7
urban 19.2
photographic equipment 17.9
modern 17.5
reflection 17.3
building 16
office 15.8
design 15.7
people 15.1
interior 15
construction 14.5
silhouette 14.1
hall 13.1
light 12.7
work 12.6
businessman 12.4
snow 12.2
structure 12.2
group 12.1
winter 11.9
ice 11.3
floor 11.2
house 10.9
wagon 10.7
crowd 10.6
window 10.3
motion 10.3
sky 10.2
glass 10.1
station 9.9
corridor 9.8
space 9.3
finance 9.3
digital 8.9
man 8.7
wheeled vehicle 8.7
scene 8.7
day 8.6
blurred 8.6
men 8.6
walking 8.5
perspective 8.5
travel 8.4
outdoor 8.4
facility 8.4
blur 8.4
occupation 8.2
home 8.2
landscape 8.2
person 8.1
water 8
car 8
job 8
adult 7.8
male 7.8
cold 7.7
corporate 7.7
wall 7.7
way 7.7
industry 7.7
room 7.7
screen 7.5
human 7.5
symbol 7.4
speed 7.3
transport 7.3
graphic 7.3
weather 7.3
transportation 7.2
team 7.2
transparent 7.2
idea 7.1
gymnasium 7.1

Google
created on 2022-01-23

Window 93.6
Black 89.6
Building 87.2
Black-and-white 86.7
World 86
Style 84
Line 82.1
Font 81
Adaptation 79.3
Art 78.4
Monochrome photography 75.5
Monochrome 75.1
Beauty 75
Snapshot 74.3
Rectangle 73.9
Room 70.7
Symmetry 68.7
Design 68.4
Flooring 68
Arch 67.5

Microsoft
created on 2022-01-23

text 99.3
black and white 70.3
building 64.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 62.2%
Calm 90.9%
Sad 4.3%
Happy 2.1%
Fear 0.8%
Confused 0.5%
Angry 0.5%
Disgusted 0.4%
Surprised 0.4%

AWS Rekognition

Age 20-28
Gender Male, 64%
Happy 55.3%
Calm 18.2%
Fear 9.2%
Sad 6.8%
Surprised 3.1%
Angry 2.9%
Confused 2.5%
Disgusted 2.1%

AWS Rekognition

Age 28-38
Gender Male, 93.8%
Calm 81.2%
Sad 9%
Angry 3.4%
Confused 2.4%
Happy 1.5%
Disgusted 0.9%
Surprised 0.9%
Fear 0.6%

AWS Rekognition

Age 26-36
Gender Male, 67.3%
Calm 86.4%
Happy 5.5%
Sad 5.4%
Angry 0.8%
Disgusted 0.5%
Confused 0.5%
Fear 0.5%
Surprised 0.4%

AWS Rekognition

Age 19-27
Gender Female, 89.8%
Calm 91.2%
Happy 3.7%
Sad 2.8%
Confused 1.2%
Fear 0.4%
Angry 0.4%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 24-34
Gender Female, 91.5%
Calm 93.4%
Confused 2.8%
Sad 1.4%
Happy 0.8%
Angry 0.6%
Fear 0.4%
Surprised 0.3%
Disgusted 0.2%

Feature analysis

Amazon

Person 98.5%

Categories

Text analysis

Amazon

23003
٤٥٥٤٦

Google

值| 6 23003
|
6
23003