Human Generated Data

Title

Untitled (women posed around table outdoors at wedding event)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9272

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.6
Apparel 99.6
Human 99.4
Person 99.4
Person 99.1
Person 99
Person 98.9
Person 98.6
Person 98.5
Person 98.3
Person 97.8
Person 96.9
Person 96
Car 92.4
Transportation 92.4
Vehicle 92.4
Automobile 92.4
Person 89.8
Female 89.7
Chair 87.8
Furniture 87.8
Face 85.6
Wheel 85.1
Machine 85.1
Dress 84.2
Person 82.1
Home Decor 79.8
Person 78.2
Person 78.1
Woman 77.1
Icing 75
Food 75
Dessert 75
Cake 75
Cream 75
Creme 75
Person 72
People 71.5
Collage 70.8
Poster 70.8
Advertisement 70.8
Meal 70.7
Crowd 69
Hat 66.4
Photography 66.1
Photo 66.1
Portrait 66.1
Girl 63
Person 62.5
Linen 59.2
Leisure Activities 58.2
Sitting 56.2
Person 55.2

Clarifai
created on 2023-10-26

people 99.8
group 99.2
adult 98.8
woman 98.4
many 97.7
man 97.7
group together 96.8
child 93.7
administration 91.9
wear 91.3
education 90.7
leader 89
medical practitioner 87.9
furniture 86
several 84.3
war 83.9
military 83.3
recreation 82
indoors 81.5
sit 80.9

Imagga
created on 2022-01-23

man 33.7
businessman 30.9
people 30.7
business 30.4
male 29.1
person 28.3
office 23.4
nurse 22.5
work 19.7
working 19.4
team 18.8
meeting 17.9
room 17.4
adult 17.4
worker 15.1
table 14.7
computer 14.4
businesspeople 14.2
teamwork 13.9
men 13.7
group 13.7
businesswoman 13.6
world 13.3
talking 13.3
corporate 12.9
job 12.4
smiling 12.3
professional 12.1
executive 12.1
sitting 12
happy 11.9
conference 11.7
discussion 11.7
desk 11.3
laptop 11.1
together 10.5
portrait 10.3
paper 10
human 9.7
indoors 9.7
30s 9.6
couple 9.6
home 9.6
student 9.3
manager 9.3
communication 9.2
suit 9
color 8.9
colleagues 8.7
photographer 8.6
casual 8.5
doctor 8.5
old 8.4
20s 8.2
life 8.2
new 8.1
hospital 8.1
women 7.9
discussing 7.9
client 7.8
day 7.8
document 7.6
finance 7.6
plan 7.6
patient 7.6
camera 7.4
cheerful 7.3
uniform 7.2
smile 7.1
medical 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.8
text 97.5
clothing 90
black and white 52.4
drawing 52.1
clothes 20.4
crowd 0.9

Face analysis

AWS Rekognition

Age 50-58
Gender Female, 75.2%
Sad 50.6%
Happy 30.1%
Confused 8.7%
Calm 5%
Surprised 3.2%
Fear 1%
Disgusted 0.9%
Angry 0.6%

AWS Rekognition

Age 27-37
Gender Male, 96.5%
Calm 42.4%
Surprised 28.8%
Happy 25.1%
Sad 1.3%
Disgusted 0.9%
Angry 0.7%
Confused 0.5%
Fear 0.3%

AWS Rekognition

Age 37-45
Gender Female, 54.2%
Calm 49.7%
Surprised 41.7%
Happy 5.3%
Sad 1.3%
Disgusted 0.9%
Confused 0.8%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Female, 99.6%
Happy 81%
Calm 16.5%
Sad 0.8%
Surprised 0.6%
Confused 0.4%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 42-50
Gender Male, 65%
Surprised 66.4%
Calm 25.5%
Happy 6.8%
Disgusted 0.5%
Sad 0.3%
Confused 0.3%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 34-42
Gender Male, 99.6%
Happy 98.7%
Calm 1.2%
Surprised 0%
Confused 0%
Disgusted 0%
Sad 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 35-43
Gender Female, 88.1%
Surprised 37.4%
Calm 34.1%
Happy 10%
Sad 6.9%
Confused 5.2%
Angry 2.9%
Disgusted 2.3%
Fear 1.3%

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Sad 93.2%
Calm 3.7%
Confused 2.1%
Disgusted 0.4%
Happy 0.3%
Angry 0.2%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 36-44
Gender Female, 54.9%
Happy 48.7%
Calm 47.7%
Sad 2.8%
Disgusted 0.2%
Confused 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Female, 75.9%
Calm 93.5%
Sad 2.1%
Fear 1%
Happy 1%
Disgusted 0.8%
Confused 0.5%
Angry 0.5%
Surprised 0.5%

AWS Rekognition

Age 54-64
Gender Female, 74.5%
Happy 99.4%
Sad 0.3%
Calm 0.1%
Surprised 0.1%
Angry 0%
Disgusted 0%
Fear 0%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Car 92.4%
Wheel 85.1%