Human Generated Data

Title

Untitled (groomsmen standing in line near trees)

Date

1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8443

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.4
Person 99.4
Person 99.2
Clothing 99.2
Apparel 99.2
Person 99.2
Person 98.8
Person 98.8
Person 98.7
Person 98.7
Person 98.7
Person 98.7
Person 98.6
Person 98.6
Person 98.3
Person 97.3
Suit 97.2
Overcoat 97.2
Coat 97.2
Person 97.2
Person 97.2
Female 89.5
Dress 87.3
Fashion 85.4
Gown 85.4
Plant 85.3
Tree 85.3
Robe 83.5
Wedding 73.7
People 73.3
Woman 73.1
Evening Dress 72.2
Tuxedo 71.5
Wedding Gown 68.5
Face 66.5
Photo 61.6
Portrait 61.6
Photography 61.6
Fence 61.1
Bridegroom 58.3
Outdoors 55.8

Clarifai
created on 2023-10-26

people 99.8
group together 98.8
many 97.9
adult 97.8
man 97.6
group 97.4
woman 93.3
crowd 86.1
wear 85.4
desktop 80
monochrome 77.8
child 77.4
uniform 76.6
recreation 73.9
no person 73.7
military 67.8
administration 67.4
leader 66.2
street 64.8
portrait 64.1

Imagga
created on 2022-01-15

picket fence 100
fence 100
barrier 100
obstruction 70.2
structure 40.5
landscape 21.6
sky 19.1
architecture 18
stone 17.4
old 16
building 15.9
travel 15.5
memorial 15.5
tree 13.1
black 12.6
city 12.5
rock 12.2
ancient 12.1
outdoors 11.9
sun 11.3
antique 11.2
grass 11.1
gravestone 10.8
landmark 10.8
scenery 10.8
park 10.7
wall 10.6
winter 10.2
clouds 10.1
natural 10
snow 10
tourism 9.9
trees 9.8
art 9.8
rural 9.7
scenic 9.7
scene 9.5
monument 9.3
national 9.1
summer 9
history 8.9
country 8.8
field 8.4
house 8.4
wood 8.3
countryside 8.2
tranquil 8.1
group 8.1
mountain 8
ice 7.7
street 7.4
water 7.3
historic 7.3
peaceful 7.3
design 7.3
horizon 7.2
meadow 7.2
sea 7

Google
created on 2022-01-15

Dress 91.4
Human body 88.8
Gesture 85.3
Headgear 83.1
Plant 82
Adaptation 79.3
Font 78.5
Tree 77.5
Team 74.5
Uniform 73
Event 72.4
Team sport 69.4
Rectangle 68.4
Grass 67.8
Vintage clothing 65
Stock photography 62.5
Paper product 61.4
History 60.4
Room 60.2
Monochrome 60

Microsoft
created on 2022-01-15

text 94.9
black and white 88.8
black 85
white 68.6

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Male, 93.7%
Calm 95%
Happy 1.5%
Sad 0.9%
Fear 0.8%
Surprised 0.5%
Confused 0.5%
Disgusted 0.4%
Angry 0.3%

AWS Rekognition

Age 39-47
Gender Female, 86.5%
Happy 97.8%
Calm 1.4%
Fear 0.5%
Surprised 0.1%
Sad 0.1%
Angry 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 37-45
Gender Male, 99.7%
Calm 86.4%
Happy 10.2%
Fear 1%
Confused 0.6%
Sad 0.6%
Surprised 0.5%
Disgusted 0.4%
Angry 0.4%

AWS Rekognition

Age 30-40
Gender Male, 70.2%
Happy 94.8%
Calm 2.3%
Fear 1.7%
Surprised 0.3%
Disgusted 0.3%
Sad 0.2%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 35-43
Gender Female, 50.7%
Happy 99.5%
Calm 0.1%
Surprised 0.1%
Confused 0.1%
Sad 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 96.5%
Fear 61.1%
Happy 20%
Sad 6.5%
Confused 3.7%
Calm 3.3%
Surprised 2.5%
Disgusted 1.4%
Angry 1.3%

AWS Rekognition

Age 41-49
Gender Male, 99.4%
Calm 99.9%
Happy 0.1%
Disgusted 0%
Surprised 0%
Confused 0%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 99.4%
Calm 90.4%
Happy 3.1%
Sad 2.4%
Angry 2.2%
Fear 0.8%
Disgusted 0.6%
Surprised 0.3%
Confused 0.3%

AWS Rekognition

Age 52-60
Gender Female, 51.8%
Calm 97.8%
Happy 1.8%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0%
Sad 0%

AWS Rekognition

Age 43-51
Gender Male, 82.2%
Calm 83.5%
Happy 15.4%
Surprised 0.3%
Fear 0.2%
Sad 0.2%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 40-48
Gender Female, 92.9%
Happy 53.4%
Calm 44.8%
Disgusted 0.7%
Fear 0.4%
Surprised 0.4%
Confused 0.2%
Sad 0.2%
Angry 0.1%

AWS Rekognition

Age 49-57
Gender Male, 91.3%
Calm 87.8%
Happy 3.2%
Confused 2%
Surprised 1.8%
Disgusted 1.8%
Angry 1.4%
Sad 1.4%
Fear 0.6%

AWS Rekognition

Age 27-37
Gender Male, 62.6%
Calm 99.8%
Fear 0.1%
Happy 0%
Sad 0%
Surprised 0%
Disgusted 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 49-57
Gender Female, 62.4%
Calm 84.8%
Happy 9.2%
Sad 2.5%
Confused 1%
Surprised 0.9%
Disgusted 0.6%
Fear 0.5%
Angry 0.5%

AWS Rekognition

Age 31-41
Gender Female, 89.5%
Calm 97.9%
Fear 0.7%
Surprised 0.4%
Happy 0.3%
Disgusted 0.2%
Sad 0.2%
Confused 0.2%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

12573
12573.
AF

Google

NAOON- YT3RA2-NAM1
NAOON-
YT3RA2-NAM1